| datasetId (string, lengths 2–117) | card (string, lengths 19–1.01M) |
|---|---|
amishshah/slay | ---
dataset_info:
features:
- name: title
dtype: string
- name: label
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 45166669.74
num_examples: 27000
- name: test
num_bytes: 5018518.86
num_examples: 3000
download_size: 27089400
dataset_size: 50185188.6
---
# Dataset Card for "slay"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
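The `dataset_info` block above can be sanity-checked with a few lines of arithmetic: the per-split `num_bytes` should sum to `dataset_size`, and dividing by `num_examples` gives the average example size. A minimal sketch, using only the numbers copied from the YAML (no `load_dataset` call needed):

```python
# Split metadata copied from the dataset_info block above.
splits = {
    "train": {"num_bytes": 45166669.74, "num_examples": 27000},
    "test": {"num_bytes": 5018518.86, "num_examples": 3000},
}
dataset_size = 50185188.6

# The per-split byte counts should add up to the declared dataset_size.
total = sum(s["num_bytes"] for s in splits.values())
assert abs(total - dataset_size) < 1e-6

# Average (uncompressed) size of one example per split.
for name, s in splits.items():
    avg = s["num_bytes"] / s["num_examples"]
    print(f"{name}: ~{avg:.0f} bytes/example")
```

Note that `download_size` (27089400) is smaller than `dataset_size` because the Parquet files stored on the Hub are compressed.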
ms57rd/THINGS-EEG_NICE-EEG_preprocessing | ---
license: apache-2.0
---
|
shi3z/ja_conv_wikipedia_orion14B_10K | ---
task_categories:
- conversational
language:
- ja
size_categories:
- 10K<n<100K
---
# Abstract
This is a multi-turn conversation dataset generated from the Japanese Wikipedia dataset using Orion14B-Chat. Commercial use is possible, but the license is complicated, so please read it carefully before using it.
Generation took about half a day on 10 machines with 4x V100 GPUs each.
# License
【Orion-14B Series】 Models Community License Agreement
https://huggingface.co/OrionStarAI/Orion-14B-Chat/blob/main/ModelsCommunityLicenseAgreement
# Computing
ABCI
https://abci.ai/ja/ |
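The open-llm-leaderboard entry below names its splits and Parquet files after run timestamps: in split names, `-` and `:` are replaced with `_`, while in file names only the `:` separators in the time portion become `-`. A small sketch of that mapping, inferred from the paths listed in the card (not an official API):

```python
def to_split_name(ts: str) -> str:
    # "2023-10-25T06:54:58.430499" -> "2023_10_25T06_54_58.430499"
    return ts.replace("-", "_").replace(":", "_")

def to_file_stamp(ts: str) -> str:
    # "2023-10-25T06:54:58.430499" -> "2023-10-25T06-54-58.430499"
    date, time = ts.split("T")
    return date + "T" + time.replace(":", "-")

run = "2023-10-25T06:54:58.430499"
print(to_split_name(run))
print(to_file_stamp(run))
```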
open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B | ---
pretty_name: Evaluation run of princeton-nlp/Sheared-LLaMA-1.3B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [princeton-nlp/Sheared-LLaMA-1.3B](https://huggingface.co/princeton-nlp/Sheared-LLaMA-1.3B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run\
\ can be found as a specific split in each configuration, the split being named\
\ using the timestamp of the run. The \"train\" split always points to the latest\
\ results.\n\nAn additional configuration \"results\" stores all the aggregated\
\ results of the run (and is used to compute and display the aggregated metrics\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T06:54:58.430499](https://huggingface.co/datasets/open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B/blob/main/results_2023-10-25T06-54-58.430499.json)\
\ (note that there might be results for other tasks in the repos if successive\
\ evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n\
\ \"em_stderr\": 0.0003921042190298358,\n \"f1\": 0.045623951342281956,\n\
\ \"f1_stderr\": 0.0012088045479754918,\n \"acc\": 0.2954867628904967,\n\
\ \"acc_stderr\": 0.007847263403599461\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.0003921042190298358,\n\
\ \"f1\": 0.045623951342281956,\n \"f1_stderr\": 0.0012088045479754918\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.004548900682335102,\n \
\ \"acc_stderr\": 0.0018535550440036204\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5864246250986582,\n \"acc_stderr\": 0.013840971763195303\n\
\ }\n}\n```"
repo_url: https://huggingface.co/princeton-nlp/Sheared-LLaMA-1.3B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|arc:challenge|25_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T06_54_58.430499
path:
- '**/details_harness|drop|3_2023-10-25T06-54-58.430499.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T06-54-58.430499.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T06_54_58.430499
path:
- '**/details_harness|gsm8k|5_2023-10-25T06-54-58.430499.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T06-54-58.430499.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hellaswag|10_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T06_54_58.430499
path:
- '**/details_harness|winogrande|5_2023-10-25T06-54-58.430499.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T06-54-58.430499.parquet'
- config_name: results
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- results_2023-10-10T21-37-25.489785.parquet
- split: 2023_10_25T06_54_58.430499
path:
- results_2023-10-25T06-54-58.430499.parquet
- split: latest
path:
- results_2023-10-25T06-54-58.430499.parquet
---
# Dataset Card for Evaluation run of princeton-nlp/Sheared-LLaMA-1.3B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/princeton-nlp/Sheared-LLaMA-1.3B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [princeton-nlp/Sheared-LLaMA-1.3B](https://huggingface.co/princeton-nlp/Sheared-LLaMA-1.3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-25T06:54:58.430499](https://huggingface.co/datasets/open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B/blob/main/results_2023-10-25T06-54-58.430499.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298358,
"f1": 0.045623951342281956,
"f1_stderr": 0.0012088045479754918,
"acc": 0.2954867628904967,
"acc_stderr": 0.007847263403599461
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298358,
"f1": 0.045623951342281956,
"f1_stderr": 0.0012088045479754918
},
"harness|gsm8k|5": {
"acc": 0.004548900682335102,
"acc_stderr": 0.0018535550440036204
},
"harness|winogrande|5": {
"acc": 0.5864246250986582,
"acc_stderr": 0.013840971763195303
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
SocialGrep/reddit-nonewnormal-complete | ---
annotations_creators:
- lexyr
language_creators:
- crowdsourced
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 1M<n<10M
source_datasets:
- original
paperswithcode_id: null
---
# Dataset Card for reddit-nonewnormal-complete
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://socialgrep.com/datasets](https://socialgrep.com/datasets?utm_source=huggingface&utm_medium=link&utm_campaign=dataset&utm_term=nonewnormal)
- **Point of Contact:** [Website](https://socialgrep.com/contact?utm_source=huggingface&utm_medium=link&utm_campaign=dataset&utm_term=nonewnormal)
### Dataset Summary
This corpus contains the complete data for the activity on subreddit /r/NoNewNormal for the entire duration of its existence.
### Languages
Mainly English.
## Dataset Structure
### Data Instances
A data point is a post or a comment. Due to the separate nature of the two, those exist in two different files - even though many fields are shared.
### Data Fields
- 'type': the type of the data point. Can be 'post' or 'comment'.
- 'id': the base-36 Reddit ID of the data point. Unique when combined with type.
- 'subreddit.id': the base-36 Reddit ID of the data point's host subreddit. Unique.
- 'subreddit.name': the human-readable name of the data point's host subreddit.
- 'subreddit.nsfw': a boolean marking the data point's host subreddit as NSFW or not.
- 'created_utc': a UTC timestamp for the data point.
- 'permalink': a reference link to the data point on Reddit.
- 'domain': (Post only) the domain of the data point's link.
- 'url': (Post only) the destination of the data point's link, if any.
- 'selftext': (Post only) the self-text of the data point, if any.
- 'title': (Post only) the title of the post data point.
- 'body': (Comment only) the body of the comment data point.
- 'sentiment': (Comment only) the result of an in-house sentiment analysis pipeline. Used for exploratory analysis.
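As a rough sketch, records from the two files can be distinguished by the shared `type` field. The field names below follow the list above, but the values are made up purely for illustration:

```python
# Illustrative records using the field names documented above
# (the values are hypothetical, not taken from the actual dataset).
points = [
    {"type": "post", "id": "abc123", "subreddit.name": "NoNewNormal",
     "created_utc": 1609459200, "title": "Example post title", "selftext": ""},
    {"type": "comment", "id": "def456", "subreddit.name": "NoNewNormal",
     "created_utc": 1609459260, "body": "Example comment body", "sentiment": -0.2},
]

# Split the combined stream back into posts and comments.
posts = [p for p in points if p["type"] == "post"]
comments = [p for p in points if p["type"] == "comment"]

print(len(posts), len(comments))  # 1 1
```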
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
CC-BY v4.0
### Contributions
[Needs More Information] |
amphora/QARV | ---
dataset_info:
features:
- name: Question
dtype: string
- name: Source
dtype: string
splits:
- name: train
num_bytes: 7971
num_examples: 104
download_size: 5407
dataset_size: 7971
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# QARV (Question and Answers with Regional Variance)
```EleutherAI Community Project```
The QARV (Question and Answers with Regional Variance) project aims to curate a collection of questions with answers that exhibit regional variations across different nations.
## Version
This is the first slice of the QARV dataset, collected via (1) manual collection by humans or (2) GPT-4 generation with RAG (Wikipedia) followed by human filtering.
Collected questions undergo a second quality filtering (removal of inadequate questions and Yes/No questions) and deduplication.
GATE-engine/automated_cardiac_diagnosis_competition.ACDC | ---
dataset_info:
features:
- name: four_d_img
sequence:
sequence:
sequence:
sequence: float32
- name: frame_data
list:
- name: img
sequence:
sequence:
sequence: float32
- name: label
sequence:
sequence:
sequence: int64
splits:
- name: train
num_bytes: 7089368208
num_examples: 100
- name: test
num_bytes: 3489827928
num_examples: 50
download_size: 363153048
dataset_size: 10579196136
---
# Dataset Card for "automated_cardiac_diagnosis_competition.ACDC"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
patruff/chucklesC1 | ---
license: mit
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 321805
num_examples: 2793
- name: test
num_bytes: 82089
num_examples: 699
download_size: 121653
dataset_size: 403894
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Safeer143/eli5_dataset_title_text_20k | ---
dataset_info:
features:
- name: text
dtype: string
- name: embeddings
sequence: float32
splits:
- name: train
num_bytes: 78426671
num_examples: 20000
download_size: 84756340
dataset_size: 78426671
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "eli5_dataset_title_text_20k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FFN/xosc2pic_Carla | ---
license: mit
---
|
sagteam/author_profiling | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- ru
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: The Corpus for the analysis of author profiling in Russian-language texts.
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- multi-class-classification
- multi-label-classification
---
# Dataset Card for [author_profiling]
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/sag111/Author-Profiling
- **Repository:** https://github.com/sag111/Author-Profiling
- **Paper:** [Needs More Information]
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [Sboev Alexander](mailto:sag111@mail.ru)
### Dataset Summary
The corpus for author profiling analysis contains Russian-language texts labeled for 5 tasks:
1) gender -- 13448 texts labeled with whether the author is female or male;
2) age -- 13448 texts labeled with the age of the author, a number from 12 to 80. In addition, for the classification task we added 5 age groups: 0-19; 20-29; 30-39; 40-49; 50+;
3) age imitation -- 8460 texts, where crowdsource authors were asked to write three texts:
a) in their natural manner,
b) imitating the style of someone younger,
c) imitating the style of someone older;
4) gender imitation -- 4988 texts, where crowdsource authors were asked to write texts both in their original gender and pretending to be the opposite gender;
5) style imitation -- 4988 texts, where crowdsource authors were asked to write a text on behalf of another person of their own gender, distorting the author's usual style.
The dataset was collected using the Yandex.Toloka service [link](https://toloka.yandex.ru/en).
You can read the data using the following python code:
```
import json

def load_jsonl(input_path: str) -> list:
    """Read a list of objects from a JSON lines file."""
    data = []
    with open(input_path, 'r', encoding='utf-8') as f:
        for line in f:
            data.append(json.loads(line.rstrip('\n\r')))
    print('Loaded {} records from {}\n'.format(len(data), input_path))
    return data

path_to_file = "./data/train.jsonl"
data = load_jsonl(path_to_file)
```
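As an illustration of the JSON-lines format that `load_jsonl` reads, the following self-contained sketch writes two hypothetical records to a temporary file and reads them back line by line:

```python
import json
import tempfile

# Hypothetical records in roughly the same shape as the corpus documents.
records = [
    {"id": "crowdsource_0001", "gender": "female", "age": 25},
    {"id": "crowdsource_0002", "gender": "male", "age": 34},
]

# Write one JSON object per line, as in the dataset files.
with tempfile.NamedTemporaryFile("w", suffix=".jsonl", delete=False,
                                 encoding="utf-8") as f:
    for rec in records:
        f.write(json.dumps(rec) + "\n")
    path = f.name

# Read it back line by line, as load_jsonl does.
loaded = [json.loads(line) for line in open(path, encoding="utf-8")]
print(loaded[0]["id"])  # crowdsource_0001
```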
or you can use HuggingFace style:
```
from datasets import load_dataset
train_df = load_dataset('sagteam/author_profiling', split='train')
valid_df = load_dataset('sagteam/author_profiling', split='validation')
test_df = load_dataset('sagteam/author_profiling', split='test')
```
#### Here are some statistics:
1. For Train file:
- No. of documents -- 9564;
- No. of unique texts -- 9553;
- Text length in characters -- min: 197, max: 2984, mean: 500.5;
- No. of documents written -- by men: 4704, by women: 4860;
- No. of unique authors -- 2344; men: 1172, women: 1172;
- Age of the authors -- min: 13, max: 80, mean: 31.2;
- No. of documents by age group -- 0-19: 813, 20-29: 4188, 30-39: 2697, 40-49: 1194, 50+: 672;
- No. of documents with gender imitation: 1215; without gender imitation: 2430; not applicable: 5919;
- No. of documents with age imitation -- younger: 1973; older: 1973; without age imitation: 1973; not applicable: 3645;
- No. of documents with style imitation: 1215; without style imitation: 2430; not applicable: 5919.
2. For Valid file:
- No. of documents -- 1320;
- No. of unique texts -- 1316;
- Text length in characters -- min: 200, max: 2809, mean: 520.8;
- No. of documents written -- by men: 633, by women: 687;
- No. of unique authors -- 336; men: 168, women: 168;
- Age of the authors -- min: 15, max: 79, mean: 32.2;
- No. of documents by age group -- 1-19: 117, 20-29: 570, 30-39: 339, 40-49: 362, 50+: 132;
- No. of documents with gender imitation: 156; without gender imitation: 312; not applicable: 852;
- No. of documents with age imitation -- younger: 284; older: 284; without age imitation: 284; not applicable: 468;
- No. of documents with style imitation: 156; without style imitation: 312; not applicable: 852.
3. For Test file:
- No. of documents -- 2564;
- No. of unique texts -- 2561;
- Text length in characters -- min: 199, max: 3981, mean: 515.6;
- No. of documents written -- by men: 1290, by women: 1274;
- No. of unique authors -- 672; men: 336, women: 336;
- Age of the authors -- min: 12, max: 67, mean: 31.8;
- No. of documents by age group -- 1-19: 195, 20-29: 1131, 30-39: 683, 40-49: 351, 50+: 204;
- No. of documents with gender imitation: 292; without gender imitation: 583; not applicable: 1689;
- No. of documents with age imitation -- younger: 563; older: 563; without age imitation: 563; not applicable: 875;
- No. of documents with style imitation: 292; without style imitation: 583; not applicable: 1689.
### Supported Tasks and Leaderboards
This dataset is intended for multi-class and multi-label text classification.
The baseline models currently achieve the following weighted F1 scores:
| Model name | gender | age_group | gender_imitation | age_imitation | style_imitation | no_imitation | average |
| ------------------- | ------ | --------- | ---------------- | ------------- | --------------- | ------------ | ------- |
| Dummy-stratified | 0.49 | 0.29 | 0.56 | 0.32 | 0.57 | 0.55 | 0.46 |
| Dummy-uniform | 0.49 | 0.23 | 0.51 | 0.32 | 0.51 | 0.51 | 0.43 |
| Dummy-most_frequent | 0.34 | 0.27 | 0.53 | 0.17 | 0.53 | 0.53 | 0.40 |
| LinearSVC + TF-IDF | 0.67 | 0.37 | 0.62 | 0.72 | 0.71 | 0.71 | 0.63 |
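The LinearSVC + TF-IDF row above can be reproduced in outline as follows. This is only a minimal sketch on toy stand-in data; the reported scores come from the authors' actual training setup, which is not shown here:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Toy stand-in texts and labels; the real task uses the corpus splits.
texts = ["пример текста один", "пример текста два",
         "другой текст три", "другой текст четыре"]
labels = ["male", "male", "female", "female"]

# TF-IDF features fed into a linear SVM, as in the baseline row above.
clf = make_pipeline(TfidfVectorizer(), LinearSVC())
clf.fit(texts, labels)
pred = clf.predict(["пример текста пять"])
print(pred[0])
```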
### Languages
The text in the dataset is in Russian.
## Dataset Structure
### Data Instances
Each instance is a text in Russian with some author profiling annotations.
An example for an instance from the dataset is shown below:
```
{
'id': 'crowdsource_4916',
'text': 'Ты очень симпатичный, Я давно не с кем не встречалась. Ты мне сильно понравился, ты умный интересный и удивительный, приходи ко мне в гости , у меня есть вкусное вино , и приготовлю вкусный ужин, посидим пообщаемся, узнаем друг друга поближе.',
'account_id': 'account_#1239',
'author_id': 411,
'age': 22,
'age_group': '20-29',
'gender': 'male',
'no_imitation': 'with_any_imitation',
'age_imitation': 'None',
'gender_imitation': 'with_gender_imitation',
'style_imitation': 'no_style_imitation'
}
```
### Data Fields
Data Fields includes:
- id -- unique identifier of the sample;
- text -- authors text written by a crowdsourcing user;
- author_id -- unique identifier of the user;
- account_id -- unique identifier of the crowdsource account;
- age -- age annotations;
- age_group -- age group annotations;
- no_imitation -- imitation annotations.
Label codes:
- 'with_any_imitation' -- there is some imitation in the text;
- 'no_any_imitation' -- the text is written without any imitation
- age_imitation -- age imitation annotations.
Label codes:
- 'younger' -- someone younger than the author is imitated in the text;
- 'older' -- someone older than the author is imitated in the text;
- 'no_age_imitation' -- the text is written without age imitation;
- 'None' -- not supported (the text was not written for this task)
- gender_imitation -- gender imitation annotations.
Label codes:
- 'no_gender_imitation' -- the text is written without gender imitation;
- 'with_gender_imitation' -- the text is written with a gender imitation;
- 'None' -- not supported (the text was not written for this task)
- style_imitation -- style imitation annotations.
Label codes:
- 'no_style_imitation' -- the text is written without style imitation;
- 'with_style_imitation' -- the text is written with a style imitation;
- 'None' -- not supported (the text was not written for this task).
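As a sketch, the label codes above can be checked programmatically. The field names follow the documentation above; the sample record mirrors the example instance shown earlier:

```python
# Allowed label codes, as documented above.
ALLOWED = {
    "no_imitation": {"with_any_imitation", "no_any_imitation"},
    "age_imitation": {"younger", "older", "no_age_imitation", "None"},
    "gender_imitation": {"no_gender_imitation", "with_gender_imitation", "None"},
    "style_imitation": {"no_style_imitation", "with_style_imitation", "None"},
}

def validate(sample: dict) -> list:
    """Return the list of fields whose value is not an allowed code."""
    return [field for field, allowed in ALLOWED.items()
            if sample.get(field) not in allowed]

sample = {
    "no_imitation": "with_any_imitation",
    "age_imitation": "None",
    "gender_imitation": "with_gender_imitation",
    "style_imitation": "no_style_imitation",
}
print(validate(sample))  # []
```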
### Data Splits
The dataset includes a set of train/valid/test splits with 9564, 1320 and 2564 texts respectively.
The unique authors do not overlap between the splits.
## Dataset Creation
### Curation Rationale
The dataset consists of Russian-language texts collected using a crowdsourcing platform. It can be used to improve the accuracy of supervised classifiers in author profiling tasks.
### Source Data
#### Initial Data Collection and Normalization
Data was collected from a crowdsourcing platform. Each text was written by the author specifically for the task provided.
#### Who are the source language producers?
Russian-speaking Yandex.Toloka users.
### Annotations
#### Annotation process
We used a crowdsourcing platform to collect texts. Each respondent is asked to fill in a questionnaire including their gender, age and native language.
For the age imitation task, the respondents are to choose a topic out of a few suggested, and write three texts on it:
1) Text in their natural manner;
2) Text imitating the style of someone younger;
3) Text imitating the style of someone older.
For the gender and style imitation tasks, each author wrote three texts in different styles:
1) Text in the author's natural style;
2) Text imitating the other gender's style;
3) Text in a different style but without gender imitation.
The topics to choose from are the following.
- An attempt to persuade some arbitrary listener to meet the respondent at their place;
- A story about some memorable event/acquisition/rumour or whatever else the imaginary listener is supposed to enjoy;
- A story about oneself or about someone else, aiming to please the listener and win their favour;
- A description of oneself and one’s potential partner for a dating site;
- An attempt to persuade an unfamiliar person to come;
- A negative tour review.
The task does not pass checking and is considered improper work if it contains:
- Irrelevant answers to the questionnaire;
- Incoherent jumble of words;
- Chunks of text borrowed from somewhere else;
- Texts not conforming to the above list of topics.
Text checking is performed first by an automated search for borrowings (via an anti-plagiarism website), and then by manual review of compliance with the task.
#### Who are the annotators?
Russian-speaking Yandex.Toloka users.
### Personal and Sensitive Information
All personal data was anonymized. Each author has been assigned an impersonal, unique identifier.
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
Researchers at the AI technology lab at NRC "Kurchatov Institute". See the [website](https://sagteam.ru/).
### Licensing Information
Apache License 2.0.
### Citation Information
If you have found our results helpful in your work, feel free to cite our publication.
```
@article{сбоев2022сравнение,
title={СРАВНЕНИЕ ТОЧНОСТЕЙ МЕТОДОВ НА ОСНОВЕ ЯЗЫКОВЫХ И ГРАФОВЫХ НЕЙРОСЕТЕВЫХ МОДЕЛЕЙ ДЛЯ ОПРЕДЕЛЕНИЯ ПРИЗНАКОВ АВТОРСКОГО ПРОФИЛЯ ПО ТЕКСТАМ НА РУССКОМ ЯЗЫКЕ},
author={Сбоев, АГ and Молошников, ИА and Рыбка, РБ and Наумов, АВ and Селиванов, АА},
journal={Вестник Национального исследовательского ядерного университета МИФИ},
volume={10},
number={6},
pages={529--539},
year={2021},
publisher={Общество с ограниченной ответственностью МАИК "Наука/Интерпериодика"}
}
```
### Contributions
Thanks to [@naumov-al](https://github.com/naumov-al) for adding this dataset.
|
safecantonese/cantomap | ---
pretty_name: CantoMap
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- yue
license:
- gpl-3.0
multilinguality:
- monolingual
---
# Dataset Card for CantoMap
## Dataset Description
- **Homepage:** https://github.com/gwinterstein/CantoMap/
- **Repository:** https://github.com/gwinterstein/CantoMap/
- **Paper:** http://www.lrec-conf.org/proceedings/lrec2020/pdf/2020.lrec-1.355.pdf
### Dataset Summary
CantoMap is a corpus of conversational Hong Kong Cantonese collected with the MapTask experimental paradigm, comprising audio recordings paired with their transcriptions. See the [LREC 2020 paper](http://www.lrec-conf.org/proceedings/lrec2020/pdf/2020.lrec-1.355.pdf) and the [repository](https://github.com/gwinterstein/CantoMap/) for details.
### Languages
```
Cantonese
```
## How to use
The `datasets` library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared in one call to your local drive by using the `load_dataset` function.
For example, to download the Cantonese config, simply specify the corresponding language config name (i.e., "yue" for Cantonese):
```python
from datasets import load_dataset
cv_16 = load_dataset("safecantonese/cantomap", "yue", split="train")
```
Using the datasets library, you can also stream the dataset on-the-fly by adding a `streaming=True` argument to the `load_dataset` function call. Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire dataset to disk.
```python
from datasets import load_dataset
cv_16 = load_dataset("safecantonese/cantomap", "yue", split="train", streaming=True)
print(next(iter(cv_16)))
```
*Bonus*: create a [PyTorch dataloader](https://huggingface.co/docs/datasets/use_with_pytorch) directly with your own datasets (local/streamed).
### Local
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
from torch.utils.data.sampler import BatchSampler, RandomSampler
cv_16 = load_dataset("safecantonese/cantomap", "yue", split="train")
batch_sampler = BatchSampler(RandomSampler(cv_16), batch_size=32, drop_last=False)
dataloader = DataLoader(cv_16, batch_sampler=batch_sampler)
```
### Streaming
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
cv_16 = load_dataset("safecantonese/cantomap", "yue", split="train")
dataloader = DataLoader(cv_16, batch_size=32)
```
To find out more about loading and preparing audio datasets, head over to [hf.co/blog/audio-datasets](https://huggingface.co/blog/audio-datasets).
### Example scripts
Train your own CTC or Seq2Seq Automatic Speech Recognition models on CantoMap with `transformers` - [here](https://github.com/huggingface/transformers/tree/main/examples/pytorch/speech-recognition).
## Dataset Structure
### Data Instances
A typical data point comprises the `path` to the audio file and its `sentence`.
```python
{
'path': 'et/clips/common_voice_et_18318995.mp3',
'audio': {
'path': 'et/clips/common_voice_et_18318995.mp3',
'array': array([-0.00048828, -0.00018311, -0.00137329, ..., 0.00079346, 0.00091553, 0.00085449], dtype=float32),
'sampling_rate': 48000
},
'sentence': 'Tasub kokku saada inimestega, keda tunned juba ammust ajast saati.',
}
```
### Data Fields
`path` (`string`): The path to the audio file
`audio` (`dict`): A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`.
`sentence` (`string`): The sentence the user was prompted to speak
### Data Splits
The speech material has been subdivided into portions for train and test.
## Additional Information
### Licensing Information
gpl-3.0
### Citation Information
```
@inproceedings{lrec:2020,
  author = {Winterstein, Grégoire and Tang, Carmen and Lai, Regine},
title = {CantoMap: a Hong Kong Cantonese MapTask Corpus}
}
```
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/c840440e | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 36
num_examples: 2
download_size: 1264
dataset_size: 36
---
# Dataset Card for "c840440e"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fathyshalab/massive_transport-de | ---
dataset_info:
features:
- name: id
dtype: string
- name: label
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 36885
num_examples: 571
- name: validation
num_bytes: 7175
num_examples: 110
- name: test
num_bytes: 7787
num_examples: 124
download_size: 28802
dataset_size: 51847
---
# Dataset Card for "massive_transport-de"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jayaprakash008/ai_voice | ---
license: unknown
---
|
Thanmay/boolq-ml | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: bool
- name: passage
dtype: string
- name: itv2 ml question
dtype: string
- name: itv2 ml passage
dtype: string
splits:
- name: validation
num_bytes: 7369040
num_examples: 3270
download_size: 3269244
dataset_size: 7369040
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
|
nazimali/quran-question-answer-context | ---
dataset_info:
features:
- name: q_id
dtype: int64
- name: question
dtype: string
- name: answer
dtype: string
- name: q_word
dtype: string
- name: q_topic
dtype: string
- name: fine_class
dtype: string
- name: class
dtype: string
- name: ontology_concept
dtype: string
- name: ontology_concept2
dtype: string
- name: source
dtype: string
- name: q_src_id
dtype: int64
- name: quetion_type
dtype: string
- name: chapter_name
dtype: string
- name: chapter_no
dtype: int64
- name: verse
sequence: string
- name: question_en
dtype: string
- name: answer_en
dtype: string
- name: q_word_en
dtype: string
- name: q_topic_en
dtype: string
- name: fine_class_en
dtype: string
- name: class_en
dtype: string
- name: ontology_concept_en
dtype: string
- name: chapter_name_en
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 2226830.0310711367
num_examples: 978
- name: test
num_bytes: 557845.9689288634
num_examples: 245
download_size: 1515128
dataset_size: 2784676.0
license: cc-by-4.0
task_categories:
- question-answering
pretty_name: Quran Question Answer with Context
language:
- ar
- en
tags:
- islam
- quran
- arabic
---
# Dataset Card for "quran-question-answer-context"
## Dataset Summary
Translated the original dataset from Arabic to English and added the Surah ayahs to the `context` column.
## Usage
```python
from datasets import load_dataset
dataset = load_dataset("nazimali/quran-question-answer-context")
```
```python
DatasetDict({
train: Dataset({
features: ['q_id', 'question', 'answer', 'q_word', 'q_topic', 'fine_class', 'class', 'ontology_concept', 'ontology_concept2', 'source', 'q_src_id', 'quetion_type', 'chapter_name', 'chapter_no', 'verse', 'question_en', 'answer_en', 'q_word_en', 'q_topic_en', 'fine_class_en', 'class_en', 'ontology_concept_en', 'chapter_name_en', 'context'],
num_rows: 978
})
test: Dataset({
features: ['q_id', 'question', 'answer', 'q_word', 'q_topic', 'fine_class', 'class', 'ontology_concept', 'ontology_concept2', 'source', 'q_src_id', 'quetion_type', 'chapter_name', 'chapter_no', 'verse', 'question_en', 'answer_en', 'q_word_en', 'q_topic_en', 'fine_class_en', 'class_en', 'ontology_concept_en', 'chapter_name_en', 'context'],
num_rows: 245
})
})
```
## Translation Info
1. Translated the Arabic questions/concept columns to English with [Helsinki-NLP/opus-mt-ar-en](https://huggingface.co/Helsinki-NLP/opus-mt-ar-en)
2. Used `en-yusufali` translations for ayas [M-AI-C/quran-en-tafssirs](https://huggingface.co/datasets/M-AI-C/quran-en-tafssirs)
3. Renamed Surahs with [kheder/quran](https://huggingface.co/datasets/kheder/quran)
4. Added the ayahs that helped answer the questions
   - Split the `ayah` column's string into a list of integers
   - Concatenated the Surah:Ayah pairs into a sentence in the `context` column
Columns with the suffix `_en` contain the translations of the original columns.
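Step 4 above can be sketched roughly as follows. This is a hypothetical reconstruction, not the dataset's actual build code: the `ayah_text` lookup and the exact formatting of the context sentence are illustrative assumptions.

```python
# Hypothetical sketch of building the `context` column (step 4 above).
# `verse_refs` mimics the `verse` column ("chapter:ayah" strings);
# `ayah_text` stands in for the en-yusufali translation lookup.
def build_context(verse_refs, ayah_text):
    """Concatenate the referenced ayahs into a single context string."""
    parts = []
    for ref in verse_refs:
        chapter, ayah = (int(x) for x in ref.split(":"))
        parts.append(f"Surah {chapter}, Ayah {ayah}: {ayah_text[(chapter, ayah)]}")
    return " ".join(parts)

ayah_text = {(1, 1): "In the name of Allah, Most Gracious, Most Merciful."}
print(build_context(["1:1"], ayah_text))
```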
## TODO
The `context` column has some `null` values that need to be investigated and fixed.
## Initial Data Collection
The original dataset is from **[Annotated Corpus of Arabic Al-Quran Question and Answer](https://archive.researchdata.leeds.ac.uk/464/)**
## Licensing Information
Original dataset [license](https://archive.researchdata.leeds.ac.uk/464/): **Creative Commons Attribution 4.0 International (CC BY 4.0)**
### Contributions
Original paper authors: Alqahtani, Mohammad and Atwell, Eric (2018) Annotated Corpus of Arabic Al-Quran Question and Answer. University of Leeds. https://doi.org/10.5518/356 |
Dharil/Dataset-IN | ---
dataset_info:
features:
- name: Judgements
dtype: string
- name: Summary
dtype: string
splits:
- name: train
num_bytes: 257469
num_examples: 10
download_size: 135267
dataset_size: 257469
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Marchanjo/spider-en-pt-es-fr-enr-enb | ---
license: cc-by-sa-4.0
---
Distributed under the Creative Commons-by-sa-4.0 respecting the ShareAlike of the [Spider Dataset](https://yale-lily.github.io/spider).
Code explanations and links for the model's checkpoints and datasets are on Github [mRAT-SQL](https://github.com/C4AI/gap-text2sql)
Here is the [Hugging Face collection](https://huggingface.co/collections/Marchanjo/mrat-sql-65a671743bb0e70b416561f6); you can download the model's checkpoints and datasets there, but for a full understanding it is better to go to the Github [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
# mRAT-SQL-FIT
## A Multilingual Translator to SQL with Database Schema Pruning to Improve Self-Attention
Marcelo Archanjo Jose, Fabio Gagliardi Cozman
Long sequences of text are challenging in the context of transformers, due to quadratic memory increase in the self-attention mechanism. As this issue directly affects the translation from natural language to SQL queries (as techniques usually take as input a concatenated text with the question and the database schema), we present techniques that allow long text sequences to be handled by transformers with up to 512 input tokens. We propose a training process with database schema pruning (removal of tables and columns names that are useless for the query of interest). In addition, we used a multilingual approach with the mT5-large model fine-tuned with a data-augmented Spider dataset in four languages simultaneously: English, Portuguese, Spanish, and French. Our proposed technique used the Spider dataset and increased the exact set match accuracy results from 0.718 to 0.736 in a validation dataset (Dev). Source code, evaluations, and checkpoints are available at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
[paper published in Springer-Nature - International Journal of Information Technology](https://doi.org/10.1007/s41870-023-01342-3), [here the SharedIt link](https://rdcu.be/dff19). [here the pre-print in arXiv](https://arxiv.org/abs/2306.14256).
# mRAT-SQL+GAP
## mRAT-SQL+GAP:A Portuguese Text-to-SQL Transformer
Marcelo Archanjo José, Fabio Gagliardi Cozman
The translation of natural language questions to SQL queries has attracted growing attention, in particular in connection with transformers and similar language models. A large number of techniques are geared towards the English language; in this work, we thus investigated translation to SQL when input questions are given in the Portuguese language. To do so, we properly adapted state-of-the-art tools and resources. We changed the RAT-SQL+GAP system by relying on a multilingual BART model (we report tests with other language models), and we produced a translated version of the Spider dataset. Our experiments expose interesting phenomena that arise when non-English languages are targeted; in particular, it is better to train with original and translated training datasets together, even if a single target language is desired. This multilingual BART model fine-tuned with a double-size training dataset (English and Portuguese) achieved 83% of the baseline, making inferences for the Portuguese test dataset. This investigation can help other researchers to produce results in Machine Learning in a language different from English. Our multilingual ready version of RAT-SQL+GAP and the data are available, open-sourced as mRAT-SQL+GAP at: [mRAT-SQL](https://github.com/C4AI/gap-text2sql).
BRACIS 2021: [paper published in Springer Lecture Notes in Computer Science](https://link.springer.com/chapter/10.1007%2F978-3-030-91699-2_35), [here the pre-print in arXiv](https://arxiv.org/abs/2110.03546).
Based on: RAT-SQL+GAP: [Github](https://github.com/awslabs/gap-text2sql). Paper: [AAAI 2021 paper](https://arxiv.org/abs/2012.10309) |
irds/msmarco-document_trec-dl-hard | ---
pretty_name: '`msmarco-document/trec-dl-hard`'
viewer: false
source_datasets: ['irds/msmarco-document']
task_categories:
- text-retrieval
---
# Dataset Card for `msmarco-document/trec-dl-hard`
The `msmarco-document/trec-dl-hard` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/msmarco-document#msmarco-document/trec-dl-hard).
# Data
This dataset provides:
- `queries` (i.e., topics); count=50
- `qrels`: (relevance assessments); count=8,544
- For `docs`, use [`irds/msmarco-document`](https://huggingface.co/datasets/irds/msmarco-document)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/msmarco-document_trec-dl-hard', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/msmarco-document_trec-dl-hard', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Mackie2021DlHard,
title={How Deep is your Learning: the DL-HARD Annotated Deep Learning Dataset},
author={Iain Mackie and Jeffrey Dalton and Andrew Yates},
journal={ArXiv},
year={2021},
volume={abs/2105.07975}
}
@inproceedings{Bajaj2016Msmarco,
title={MS MARCO: A Human Generated MAchine Reading COmprehension Dataset},
author={Payal Bajaj, Daniel Campos, Nick Craswell, Li Deng, Jianfeng Gao, Xiaodong Liu, Rangan Majumder, Andrew McNamara, Bhaskar Mitra, Tri Nguyen, Mir Rosenberg, Xia Song, Alina Stoica, Saurabh Tiwary, Tong Wang},
booktitle={InCoCo@NIPS},
year={2016}
}
```
|
open-llm-leaderboard/details_synapsoft__Llama-2-7b-chat-hf-flan2022-1.2M | ---
pretty_name: Evaluation run of synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M](https://huggingface.co/synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_synapsoft__Llama-2-7b-chat-hf-flan2022-1.2M\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T08:39:00.771555](https://huggingface.co/datasets/open-llm-leaderboard/details_synapsoft__Llama-2-7b-chat-hf-flan2022-1.2M/blob/main/results_2023-09-23T08-39-00.771555.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2627936241610738,\n\
\ \"em_stderr\": 0.004507560917898865,\n \"f1\": 0.30115981543624176,\n\
\ \"f1_stderr\": 0.004494140287139199,\n \"acc\": 0.3666975232366727,\n\
\ \"acc_stderr\": 0.008004674480789642\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.2627936241610738,\n \"em_stderr\": 0.004507560917898865,\n\
\ \"f1\": 0.30115981543624176,\n \"f1_stderr\": 0.004494140287139199\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.015163002274450341,\n \
\ \"acc_stderr\": 0.003366022949726345\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7182320441988951,\n \"acc_stderr\": 0.01264332601185294\n\
\ }\n}\n```"
repo_url: https://huggingface.co/synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|arc:challenge|25_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T08_39_00.771555
path:
- '**/details_harness|drop|3_2023-09-23T08-39-00.771555.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T08-39-00.771555.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T08_39_00.771555
path:
- '**/details_harness|gsm8k|5_2023-09-23T08-39-00.771555.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T08-39-00.771555.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hellaswag|10_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T22:45:47.858606.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-04T22:45:47.858606.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-04T22:45:47.858606.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T08_39_00.771555
path:
- '**/details_harness|winogrande|5_2023-09-23T08-39-00.771555.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T08-39-00.771555.parquet'
- config_name: results
data_files:
- split: 2023_09_04T22_45_47.858606
path:
- results_2023-09-04T22:45:47.858606.parquet
- split: 2023_09_23T08_39_00.771555
path:
- results_2023-09-23T08-39-00.771555.parquet
- split: latest
path:
- results_2023-09-23T08-39-00.771555.parquet
---
# Dataset Card for Evaluation run of synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M](https://huggingface.co/synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_synapsoft__Llama-2-7b-chat-hf-flan2022-1.2M",
"harness_winogrande_5",
split="train")
```
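
As the config listing above shows, each timestamped split name is simply the run timestamp with dashes and colons replaced by underscores (the fractional-second dot is kept; note that the stamp embedded in the parquet *file names* has varied across runs, so only the split-name convention is assumed here). A minimal helper, under that assumption, to turn a run timestamp into the split name accepted by `load_dataset`:

```python
def split_name(timestamp: str) -> str:
    """Map a run timestamp to its split name in this dataset,
    e.g. "2023-09-23T08:39:00.771555" -> "2023_09_23T08_39_00.771555".
    Dashes and colons become underscores; the dot before the
    microseconds is kept as-is."""
    return timestamp.replace("-", "_").replace(":", "_")


run = "2023-09-23T08:39:00.771555"
print(split_name(run))  # 2023_09_23T08_39_00.771555
```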
## Latest results
These are the [latest results from run 2023-09-23T08:39:00.771555](https://huggingface.co/datasets/open-llm-leaderboard/details_synapsoft__Llama-2-7b-chat-hf-flan2022-1.2M/blob/main/results_2023-09-23T08-39-00.771555.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks. You can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.2627936241610738,
"em_stderr": 0.004507560917898865,
"f1": 0.30115981543624176,
"f1_stderr": 0.004494140287139199,
"acc": 0.3666975232366727,
"acc_stderr": 0.008004674480789642
},
"harness|drop|3": {
"em": 0.2627936241610738,
"em_stderr": 0.004507560917898865,
"f1": 0.30115981543624176,
"f1_stderr": 0.004494140287139199
},
"harness|gsm8k|5": {
"acc": 0.015163002274450341,
"acc_stderr": 0.003366022949726345
},
"harness|winogrande|5": {
"acc": 0.7182320441988951,
"acc_stderr": 0.01264332601185294
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-_fil_self_1.4b_bo2_100_kl_0.1_prm_410m_thr_0.3_seed_3 | ---
dataset_info:
config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: index
dtype: int64
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43628755
num_examples: 18929
- name: epoch_1
num_bytes: 43855867
num_examples: 18929
- name: epoch_2
num_bytes: 43826818
num_examples: 18929
- name: epoch_3
num_bytes: 43792384
num_examples: 18929
- name: epoch_4
num_bytes: 43753256
num_examples: 18929
- name: epoch_5
num_bytes: 43735446
num_examples: 18929
- name: epoch_6
num_bytes: 43728919
num_examples: 18929
- name: epoch_7
num_bytes: 43717076
num_examples: 18929
- name: epoch_8
num_bytes: 43713216
num_examples: 18929
- name: epoch_9
num_bytes: 43699028
num_examples: 18929
download_size: 302049802
dataset_size: 437450765
configs:
- config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
data_files:
- split: epoch_0
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_9-*
---
|
espidermon/babar-azam | ---
license: openrail
---
|
liuyanchen1015/MULTI_VALUE_sst2_bare_past_tense | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 9496
num_examples: 62
- name: test
num_bytes: 19296
num_examples: 133
- name: train
num_bytes: 288560
num_examples: 2585
download_size: 170639
dataset_size: 317352
---
# Dataset Card for "MULTI_VALUE_sst2_bare_past_tense"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_deepseek-ai__deepseek-math-7b-base | ---
pretty_name: Evaluation run of deepseek-ai/deepseek-math-7b-base
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [deepseek-ai/deepseek-math-7b-base](https://huggingface.co/deepseek-ai/deepseek-math-7b-base)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
  \ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_deepseek-ai__deepseek-math-7b-base\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-15T10:41:41.444832](https://huggingface.co/datasets/open-llm-leaderboard/details_deepseek-ai__deepseek-math-7b-base/blob/main/results_2024-03-15T10-41-41.444832.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5720441323525545,\n\
\ \"acc_stderr\": 0.0345237727785387,\n \"acc_norm\": 0.5737206040031878,\n\
\ \"acc_norm_stderr\": 0.0352334023126239,\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.01541524174023702,\n \"mc2\": 0.4071269130958089,\n\
\ \"mc2_stderr\": 0.01426178868135068\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.48976109215017066,\n \"acc_stderr\": 0.014608326906285019,\n\
\ \"acc_norm\": 0.5221843003412969,\n \"acc_norm_stderr\": 0.014597001927076136\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5126468830910177,\n\
\ \"acc_stderr\": 0.00498818498834529,\n \"acc_norm\": 0.6948814977096196,\n\
\ \"acc_norm_stderr\": 0.004595165551383618\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n\
\ \"acc_stderr\": 0.042561937679014075,\n \"acc_norm\": 0.4148148148148148,\n\
\ \"acc_norm_stderr\": 0.042561937679014075\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395269,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395269\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5735849056603773,\n \"acc_stderr\": 0.030437794342983045,\n\
\ \"acc_norm\": 0.5735849056603773,\n \"acc_norm_stderr\": 0.030437794342983045\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n\
\ \"acc_stderr\": 0.03765746693865151,\n \"acc_norm\": 0.5780346820809249,\n\
\ \"acc_norm_stderr\": 0.03765746693865151\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6595744680851063,\n \"acc_stderr\": 0.030976692998534422,\n\
\ \"acc_norm\": 0.6595744680851063,\n \"acc_norm_stderr\": 0.030976692998534422\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6482758620689655,\n \"acc_stderr\": 0.0397923663749741,\n\
\ \"acc_norm\": 0.6482758620689655,\n \"acc_norm_stderr\": 0.0397923663749741\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5634920634920635,\n \"acc_stderr\": 0.025542846817400492,\n \"\
acc_norm\": 0.5634920634920635,\n \"acc_norm_stderr\": 0.025542846817400492\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.667741935483871,\n\
\ \"acc_stderr\": 0.026795560848122794,\n \"acc_norm\": 0.667741935483871,\n\
\ \"acc_norm_stderr\": 0.026795560848122794\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5763546798029556,\n \"acc_stderr\": 0.03476725747649037,\n\
\ \"acc_norm\": 0.5763546798029556,\n \"acc_norm_stderr\": 0.03476725747649037\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6919191919191919,\n \"acc_stderr\": 0.03289477330098616,\n \"\
acc_norm\": 0.6919191919191919,\n \"acc_norm_stderr\": 0.03289477330098616\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6632124352331606,\n \"acc_stderr\": 0.03410780251836184,\n\
\ \"acc_norm\": 0.6632124352331606,\n \"acc_norm_stderr\": 0.03410780251836184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6025641025641025,\n \"acc_stderr\": 0.024811920017903836,\n\
\ \"acc_norm\": 0.6025641025641025,\n \"acc_norm_stderr\": 0.024811920017903836\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.43333333333333335,\n \"acc_stderr\": 0.030213340289237927,\n \
\ \"acc_norm\": 0.43333333333333335,\n \"acc_norm_stderr\": 0.030213340289237927\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4304635761589404,\n \"acc_stderr\": 0.04042809961395634,\n \"\
acc_norm\": 0.4304635761589404,\n \"acc_norm_stderr\": 0.04042809961395634\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7247706422018348,\n \"acc_stderr\": 0.019149093743155203,\n \"\
acc_norm\": 0.7247706422018348,\n \"acc_norm_stderr\": 0.019149093743155203\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5637254901960784,\n\
\ \"acc_stderr\": 0.03480693138457039,\n \"acc_norm\": 0.5637254901960784,\n\
\ \"acc_norm_stderr\": 0.03480693138457039\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.6286919831223629,\n \"acc_stderr\": 0.031450686007448596,\n\
\ \"acc_norm\": 0.6286919831223629,\n \"acc_norm_stderr\": 0.031450686007448596\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5919282511210763,\n\
\ \"acc_stderr\": 0.03298574607842822,\n \"acc_norm\": 0.5919282511210763,\n\
\ \"acc_norm_stderr\": 0.03298574607842822\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6776859504132231,\n \"acc_stderr\": 0.04266416363352168,\n \"\
acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.04266416363352168\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6203703703703703,\n\
\ \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.6203703703703703,\n\
\ \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664743,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664743\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.046561471100123514,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.046561471100123514\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.025140935950335445,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.025140935950335445\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6768837803320562,\n\
\ \"acc_stderr\": 0.016723726512343048,\n \"acc_norm\": 0.6768837803320562,\n\
\ \"acc_norm_stderr\": 0.016723726512343048\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5924855491329479,\n \"acc_stderr\": 0.0264545781469315,\n\
\ \"acc_norm\": 0.5924855491329479,\n \"acc_norm_stderr\": 0.0264545781469315\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27150837988826815,\n\
\ \"acc_stderr\": 0.014874252168095277,\n \"acc_norm\": 0.27150837988826815,\n\
\ \"acc_norm_stderr\": 0.014874252168095277\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5751633986928104,\n \"acc_stderr\": 0.028304576673141103,\n\
\ \"acc_norm\": 0.5751633986928104,\n \"acc_norm_stderr\": 0.028304576673141103\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6109324758842444,\n\
\ \"acc_stderr\": 0.027690337536485376,\n \"acc_norm\": 0.6109324758842444,\n\
\ \"acc_norm_stderr\": 0.027690337536485376\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5524691358024691,\n \"acc_stderr\": 0.02766713856942271,\n\
\ \"acc_norm\": 0.5524691358024691,\n \"acc_norm_stderr\": 0.02766713856942271\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.379400260756193,\n\
\ \"acc_stderr\": 0.012393202029825398,\n \"acc_norm\": 0.379400260756193,\n\
\ \"acc_norm_stderr\": 0.012393202029825398\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4007352941176471,\n \"acc_stderr\": 0.02976826352893311,\n\
\ \"acc_norm\": 0.4007352941176471,\n \"acc_norm_stderr\": 0.02976826352893311\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5277777777777778,\n \"acc_stderr\": 0.020196594933541197,\n \
\ \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.020196594933541197\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6408163265306123,\n \"acc_stderr\": 0.030713560455108493,\n\
\ \"acc_norm\": 0.6408163265306123,\n \"acc_norm_stderr\": 0.030713560455108493\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7213930348258707,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.7213930348258707,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n\
\ \"acc_stderr\": 0.03819486140758397,\n \"acc_norm\": 0.4036144578313253,\n\
\ \"acc_norm_stderr\": 0.03819486140758397\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5964912280701754,\n \"acc_stderr\": 0.03762738699917057,\n\
\ \"acc_norm\": 0.5964912280701754,\n \"acc_norm_stderr\": 0.03762738699917057\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.01541524174023702,\n \"mc2\": 0.4071269130958089,\n\
\ \"mc2_stderr\": 0.01426178868135068\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6677190213101816,\n \"acc_stderr\": 0.013238316554236521\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5921152388172858,\n \
\ \"acc_stderr\": 0.013536742075643088\n }\n}\n```"
repo_url: https://huggingface.co/deepseek-ai/deepseek-math-7b-base
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|arc:challenge|25_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|gsm8k|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hellaswag|10_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T10-41-41.444832.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-15T10-41-41.444832.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- '**/details_harness|winogrande|5_2024-03-15T10-41-41.444832.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-15T10-41-41.444832.parquet'
- config_name: results
data_files:
- split: 2024_03_15T10_41_41.444832
path:
- results_2024-03-15T10-41-41.444832.parquet
- split: latest
path:
- results_2024-03-15T10-41-41.444832.parquet
---
# Dataset Card for Evaluation run of deepseek-ai/deepseek-math-7b-base
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [deepseek-ai/deepseek-math-7b-base](https://huggingface.co/deepseek-ai/deepseek-math-7b-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_deepseek-ai__deepseek-math-7b-base",
"harness_winogrande_5",
split="train")
```
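The timestamp-style split names used above (e.g. `2024_03_15T10_41_41.444832`) can be converted back into proper datetimes when comparing runs. A minimal sketch, assuming the `_`-for-`-`/`:` substitution seen in this card's split names (the helper name is illustrative, not part of any library):

```python
from datetime import datetime

def parse_split_name(split: str) -> datetime:
    """Parse a run split name like '2024_03_15T10_41_41.444832'."""
    # Split names replace '-' and ':' with '_'; restore the ISO-like layout.
    date_part, time_part = split.split("T")
    y, m, d = date_part.split("_")
    hh, mm, ss = time_part.split("_", 2)  # ss keeps microseconds, e.g. '41.444832'
    return datetime.strptime(f"{y}-{m}-{d} {hh}:{mm}:{ss}", "%Y-%m-%d %H:%M:%S.%f")

print(parse_split_name("2024_03_15T10_41_41.444832"))
```

This makes it easy to sort the dated splits of a configuration chronologically instead of relying on the `latest` alias.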
## Latest results
These are the [latest results from run 2024-03-15T10:41:41.444832](https://huggingface.co/datasets/open-llm-leaderboard/details_deepseek-ai__deepseek-math-7b-base/blob/main/results_2024-03-15T10-41-41.444832.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5720441323525545,
"acc_stderr": 0.0345237727785387,
"acc_norm": 0.5737206040031878,
"acc_norm_stderr": 0.0352334023126239,
"mc1": 0.2631578947368421,
"mc1_stderr": 0.01541524174023702,
"mc2": 0.4071269130958089,
"mc2_stderr": 0.01426178868135068
},
"harness|arc:challenge|25": {
"acc": 0.48976109215017066,
"acc_stderr": 0.014608326906285019,
"acc_norm": 0.5221843003412969,
"acc_norm_stderr": 0.014597001927076136
},
"harness|hellaswag|10": {
"acc": 0.5126468830910177,
"acc_stderr": 0.00498818498834529,
"acc_norm": 0.6948814977096196,
"acc_norm_stderr": 0.004595165551383618
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.042561937679014075,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.042561937679014075
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395269,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395269
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5735849056603773,
"acc_stderr": 0.030437794342983045,
"acc_norm": 0.5735849056603773,
"acc_norm_stderr": 0.030437794342983045
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.03765746693865151,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.03765746693865151
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6595744680851063,
"acc_stderr": 0.030976692998534422,
"acc_norm": 0.6595744680851063,
"acc_norm_stderr": 0.030976692998534422
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6482758620689655,
"acc_stderr": 0.0397923663749741,
"acc_norm": 0.6482758620689655,
"acc_norm_stderr": 0.0397923663749741
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5634920634920635,
"acc_stderr": 0.025542846817400492,
"acc_norm": 0.5634920634920635,
"acc_norm_stderr": 0.025542846817400492
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.667741935483871,
"acc_stderr": 0.026795560848122794,
"acc_norm": 0.667741935483871,
"acc_norm_stderr": 0.026795560848122794
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5763546798029556,
"acc_stderr": 0.03476725747649037,
"acc_norm": 0.5763546798029556,
"acc_norm_stderr": 0.03476725747649037
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6919191919191919,
"acc_stderr": 0.03289477330098616,
"acc_norm": 0.6919191919191919,
"acc_norm_stderr": 0.03289477330098616
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6632124352331606,
"acc_stderr": 0.03410780251836184,
"acc_norm": 0.6632124352331606,
"acc_norm_stderr": 0.03410780251836184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6025641025641025,
"acc_stderr": 0.024811920017903836,
"acc_norm": 0.6025641025641025,
"acc_norm_stderr": 0.024811920017903836
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.43333333333333335,
"acc_stderr": 0.030213340289237927,
"acc_norm": 0.43333333333333335,
"acc_norm_stderr": 0.030213340289237927
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4304635761589404,
"acc_stderr": 0.04042809961395634,
"acc_norm": 0.4304635761589404,
"acc_norm_stderr": 0.04042809961395634
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7247706422018348,
"acc_stderr": 0.019149093743155203,
"acc_norm": 0.7247706422018348,
"acc_norm_stderr": 0.019149093743155203
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5637254901960784,
"acc_stderr": 0.03480693138457039,
"acc_norm": 0.5637254901960784,
"acc_norm_stderr": 0.03480693138457039
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6286919831223629,
"acc_stderr": 0.031450686007448596,
"acc_norm": 0.6286919831223629,
"acc_norm_stderr": 0.031450686007448596
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5919282511210763,
"acc_stderr": 0.03298574607842822,
"acc_norm": 0.5919282511210763,
"acc_norm_stderr": 0.03298574607842822
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.04266416363352168,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.04266416363352168
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.046561471100123514,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.046561471100123514
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335445,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335445
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6768837803320562,
"acc_stderr": 0.016723726512343048,
"acc_norm": 0.6768837803320562,
"acc_norm_stderr": 0.016723726512343048
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5924855491329479,
"acc_stderr": 0.0264545781469315,
"acc_norm": 0.5924855491329479,
"acc_norm_stderr": 0.0264545781469315
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27150837988826815,
"acc_stderr": 0.014874252168095277,
"acc_norm": 0.27150837988826815,
"acc_norm_stderr": 0.014874252168095277
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5751633986928104,
"acc_stderr": 0.028304576673141103,
"acc_norm": 0.5751633986928104,
"acc_norm_stderr": 0.028304576673141103
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6109324758842444,
"acc_stderr": 0.027690337536485376,
"acc_norm": 0.6109324758842444,
"acc_norm_stderr": 0.027690337536485376
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5524691358024691,
"acc_stderr": 0.02766713856942271,
"acc_norm": 0.5524691358024691,
"acc_norm_stderr": 0.02766713856942271
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.379400260756193,
"acc_stderr": 0.012393202029825398,
"acc_norm": 0.379400260756193,
"acc_norm_stderr": 0.012393202029825398
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4007352941176471,
"acc_stderr": 0.02976826352893311,
"acc_norm": 0.4007352941176471,
"acc_norm_stderr": 0.02976826352893311
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.020196594933541197,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.020196594933541197
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6408163265306123,
"acc_stderr": 0.030713560455108493,
"acc_norm": 0.6408163265306123,
"acc_norm_stderr": 0.030713560455108493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7213930348258707,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.7213930348258707,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.03819486140758397,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.03819486140758397
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.03762738699917057,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.03762738699917057
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2631578947368421,
"mc1_stderr": 0.01541524174023702,
"mc2": 0.4071269130958089,
"mc2_stderr": 0.01426178868135068
},
"harness|winogrande|5": {
"acc": 0.6677190213101816,
"acc_stderr": 0.013238316554236521
},
"harness|gsm8k|5": {
"acc": 0.5921152388172858,
"acc_stderr": 0.013536742075643088
}
}
```
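The aggregate values in the `"all"` block are produced by the evaluation harness itself; as a rough illustration only (not the harness's exact aggregation logic), a per-metric mean over the task entries can be sketched like this, using an abbreviated two-task subset of the figures above:

```python
# Illustrative sketch: average a metric across per-task results.
# The dict below is a two-task excerpt of the JSON results shown above;
# the real results object contains one entry per evaluated task.

results = {
    "harness|arc:challenge|25": {"acc": 0.48976109215017066},
    "harness|hellaswag|10": {"acc": 0.5126468830910177},
}

# Collect the metric from each task entry and take the unweighted mean.
accs = [task["acc"] for task in results.values()]
mean_acc = sum(accs) / len(accs)
```

With the full per-task dict loaded from the results JSON, the same pattern applies to `acc_norm` and the other metrics.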
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
adalib/whylogs-data | ---
dataset_info:
features:
- name: code
dtype: string
- name: apis
sequence: string
- name: extract_api
dtype: string
splits:
- name: train
num_bytes: 1356874
num_examples: 36
- name: test
num_bytes: 56653
num_examples: 10
download_size: 287434
dataset_size: 1413527
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
arieg/bw_spec_cls_80_16 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '43534'
'1': '43535'
'2': '43536'
'3': '43585'
'4': '43586'
'5': '43587'
'6': '43588'
'7': '43589'
'8': '43590'
'9': '43592'
'10': '43593'
'11': '43594'
'12': '43595'
'13': '43596'
'14': '43598'
'15': '43599'
'16': '43600'
'17': '43608'
'18': '43621'
'19': '43623'
'20': '43695'
'21': '43696'
'22': '43697'
'23': '43698'
'24': '43699'
'25': '43761'
'26': '43773'
'27': '43796'
'28': '43842'
'29': '43843'
'30': '43844'
'31': '43856'
'32': '43857'
'33': '43858'
'34': '43860'
'35': '43861'
'36': '43863'
'37': '43865'
'38': '43866'
'39': '43867'
'40': '43868'
'41': '43869'
'42': '43883'
'43': '43886'
'44': '43899'
'45': '43911'
'46': '43962'
'47': '43965'
'48': '44092'
'49': '44110'
'50': '44169'
'51': '44236'
'52': '44342'
'53': '44347'
'54': '44354'
'55': '44778'
'56': '44779'
'57': '44780'
'58': '44781'
'59': '44782'
'60': '44791'
'61': '44792'
'62': '44793'
'63': '44794'
'64': '44795'
'65': '44796'
'66': '44797'
'67': '44798'
'68': '44799'
'69': '44801'
'70': '44803'
'71': '44804'
'72': '44805'
'73': '44806'
'74': '44809'
'75': '44820'
'76': '44821'
'77': '44822'
'78': '44823'
'79': '44848'
splits:
- name: train
num_bytes: 90417910.4
num_examples: 1600
download_size: 89917143
dataset_size: 90417910.4
---
# Dataset Card for "bw_spec_cls_80_16"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lucadiliello/bioasqqa | ---
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence: string
- name: key
dtype: string
- name: labels
list:
- name: end
sequence: int64
- name: start
sequence: int64
splits:
- name: test
num_bytes: 2478570
num_examples: 1504
download_size: 1270845
dataset_size: 2478570
---
# Dataset Card for "bioasqqa"
Split taken from the MRQA 2019 Shared Task, formatted and filtered for Question Answering. For the original dataset, have a look [here](https://huggingface.co/datasets/mrqa). |
qwopqwop/ALMA-R-ko-en | ---
language:
- ko
- en
license: cc-by-sa-4.0
size_categories:
- 1K<n<10K
task_categories:
- translation
dataset_info:
config_name: ko-en
features:
- name: translation
struct:
- name: Delta
dtype: int64
- name: alma_en
dtype: string
- name: alma_en_kiwi
dtype: float64
- name: alma_en_kiwi_xcomet
dtype: float64
- name: alma_en_xcomet
dtype: float64
- name: alma_ko
dtype: string
- name: alma_ko_kiwi
dtype: float64
- name: alma_ko_kiwi_xcomet
dtype: float64
- name: alma_ko_xcomet
dtype: float64
- name: en
dtype: string
- name: gpt4_en
dtype: string
- name: gpt4_en_kiwi
dtype: float64
- name: gpt4_en_kiwi_xcomet
dtype: float64
- name: gpt4_en_xcomet
dtype: float64
- name: gpt4_ko
dtype: string
- name: gpt4_ko_kiwi
dtype: float64
- name: gpt4_ko_kiwi_xcomet
dtype: float64
- name: gpt4_ko_xcomet
dtype: float64
- name: ko
dtype: string
- name: language_pair
dtype: string
- name: ref_en_kiwi
dtype: float64
- name: ref_en_kiwi_xcomet
dtype: float64
- name: ref_en_xcomet
dtype: float64
- name: ref_ko_kiwi
dtype: float64
- name: ref_ko_kiwi_xcomet
dtype: float64
- name: ref_ko_xcomet
dtype: float64
- name: required_directions
dtype: string
splits:
- name: train
num_bytes: 2066513
num_examples: 2009
download_size: 1399967
dataset_size: 2066513
configs:
- config_name: ko-en
data_files:
- split: train
path: ko-en/train-*
---
# Dataset Card for "ALMA-R-ko-en-Preference"
ref) https://huggingface.co/datasets/haoranxu/ALMA-R-Preference
The triplet preference data, supporting 2 translation directions, is built upon the FLORES-200 development and test data. For each direction, we provide a source sentence along with three translations: one from GPT-4, another from EEVE-ALMA-LoRA, and a reference translation. For instance, in the Korean-English pair, our data structure is as follows:
### Sentences:
- ko: Original Korean sentence
- en: Original English sentence
- alma_ko: Korean sentence translated from English by ALMA
- gpt4_ko: Korean sentence translated from English by GPT-4
- alma_en: English sentence translated from Korean by ALMA
- gpt4_en: English sentence translated from Korean by GPT-4
### Scores
- alma_en_${Score}: ${Score} of English sentence translated by ALMA
- gpt4_en_${Score}: ${Score} of English sentence translated by GPT4
- ref_en_${Score}: ${Score} of reference English sentence
- alma_ko_${Score}: ${Score} of Korean sentence translated by ALMA
- gpt4_ko_${Score}: ${Score} of Korean sentence translated by GPT4
- ref_ko_${Score}: ${Score} of reference Korean sentence
${Score} can be numbers from kiwi ([wmt23-cometkiwi-da-xxl](https://huggingface.co/Unbabel/wmt23-cometkiwi-da-xxl)), xcomet ([XCOMET-XXL](https://huggingface.co/Unbabel/XCOMET-XXL)),
or kiwi_xcomet (average score of kiwi and xcomet).
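As a minimal sketch of how these fields might be used (this is illustrative and not part of the dataset card; the field names follow the schema above, but the numeric values are made up), one can pick the preferred Korean translation for a record by its averaged kiwi_xcomet score:

```python
# Hypothetical single record, using the field names from the schema above.
record = {
    "alma_ko": "ALMA 번역",
    "gpt4_ko": "GPT-4 번역",
    "alma_ko_kiwi_xcomet": 82.5,   # made-up score for illustration
    "gpt4_ko_kiwi_xcomet": 88.1,   # made-up score for illustration
    "Delta": 0,                    # 0 = non-human-annotated or tied
    "required_directions": "",     # empty = usable for both directions
}

def pick_preferred_ko(rec):
    """Return (chosen, rejected) Korean translations by kiwi_xcomet score."""
    scored = [
        (rec["alma_ko_kiwi_xcomet"], rec["alma_ko"]),
        (rec["gpt4_ko_kiwi_xcomet"], rec["gpt4_ko"]),
    ]
    scored.sort(reverse=True)  # highest score first
    return scored[0][1], scored[1][1]

chosen, rejected = pick_preferred_ko(record)
```

The resulting (chosen, rejected) pairs are the shape typically consumed by preference-optimization training loops.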
### Others
- Delta: A value of 0 indicates non-human annotated data or tied evaluations. A positive number suggests that alma_ko is better than gpt4_ko, and vice versa
- required_directions: An empty field implies that this data point can be used for both translation directions. If the string 'en-ko' is specified, it indicates that this data point is exclusively for English to Korean translation |
open-llm-leaderboard/details_Weyaxi__openchat-3.5-1210-Seraph-Slerp | ---
pretty_name: Evaluation run of Weyaxi/openchat-3.5-1210-Seraph-Slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/openchat-3.5-1210-Seraph-Slerp](https://huggingface.co/Weyaxi/openchat-3.5-1210-Seraph-Slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__openchat-3.5-1210-Seraph-Slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-08T05:17:56.550052](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__openchat-3.5-1210-Seraph-Slerp/blob/main/results_2024-01-08T05-17-56.550052.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6564663991045154,\n\
\ \"acc_stderr\": 0.031986585336666803,\n \"acc_norm\": 0.6566440007717916,\n\
\ \"acc_norm_stderr\": 0.03264682157479926,\n \"mc1\": 0.3990208078335373,\n\
\ \"mc1_stderr\": 0.017142825728496763,\n \"mc2\": 0.5774988351776751,\n\
\ \"mc2_stderr\": 0.015172641642340482\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.64419795221843,\n \"acc_stderr\": 0.013990571137918762,\n\
\ \"acc_norm\": 0.6791808873720137,\n \"acc_norm_stderr\": 0.013640943091946531\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6709818761202948,\n\
\ \"acc_stderr\": 0.004688963175758129,\n \"acc_norm\": 0.8642700657239594,\n\
\ \"acc_norm_stderr\": 0.003418015843918828\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n\
\ \"acc_stderr\": 0.035149425512674394,\n \"acc_norm\": 0.6936416184971098,\n\
\ \"acc_norm_stderr\": 0.035149425512674394\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.025487187147859375,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.025487187147859375\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7870967741935484,\n \"acc_stderr\": 0.02328766512726854,\n \"\
acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.02328766512726854\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6871794871794872,\n \"acc_stderr\": 0.023507579020645365,\n\
\ \"acc_norm\": 0.6871794871794872,\n \"acc_norm_stderr\": 0.023507579020645365\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240634,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240634\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944867,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944867\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098822,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098822\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.04738975119274155,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.04738975119274155\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8365261813537676,\n\
\ \"acc_stderr\": 0.013223928616741622,\n \"acc_norm\": 0.8365261813537676,\n\
\ \"acc_norm_stderr\": 0.013223928616741622\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044287,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3854748603351955,\n\
\ \"acc_stderr\": 0.016277927039638193,\n \"acc_norm\": 0.3854748603351955,\n\
\ \"acc_norm_stderr\": 0.016277927039638193\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.025403832978179604,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.025403832978179604\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135107,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135107\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n\
\ \"acc_stderr\": 0.012747248967079067,\n \"acc_norm\": 0.470013037809648,\n\
\ \"acc_norm_stderr\": 0.012747248967079067\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n\
\ \"acc_stderr\": 0.02411267824090083,\n \"acc_norm\": 0.8656716417910447,\n\
\ \"acc_norm_stderr\": 0.02411267824090083\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061452,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061452\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3990208078335373,\n\
\ \"mc1_stderr\": 0.017142825728496763,\n \"mc2\": 0.5774988351776751,\n\
\ \"mc2_stderr\": 0.015172641642340482\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8082083662194159,\n \"acc_stderr\": 0.011065209664659527\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7225170583775588,\n \
\ \"acc_stderr\": 0.012333447581047537\n }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/openchat-3.5-1210-Seraph-Slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|arc:challenge|25_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|arc:challenge|25_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|gsm8k|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|gsm8k|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hellaswag|10_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hellaswag|10_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T15-59-25.181262.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T05-17-56.550052.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-08T05-17-56.550052.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- '**/details_harness|winogrande|5_2023-12-29T15-59-25.181262.parquet'
- split: 2024_01_08T05_17_56.550052
path:
- '**/details_harness|winogrande|5_2024-01-08T05-17-56.550052.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-08T05-17-56.550052.parquet'
- config_name: results
data_files:
- split: 2023_12_29T15_59_25.181262
path:
- results_2023-12-29T15-59-25.181262.parquet
- split: 2024_01_08T05_17_56.550052
path:
- results_2024-01-08T05-17-56.550052.parquet
- split: latest
path:
- results_2024-01-08T05-17-56.550052.parquet
---
# Dataset Card for Evaluation run of Weyaxi/openchat-3.5-1210-Seraph-Slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/openchat-3.5-1210-Seraph-Slerp](https://huggingface.co/Weyaxi/openchat-3.5-1210-Seraph-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__openchat-3.5-1210-Seraph-Slerp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-08T05:17:56.550052](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__openchat-3.5-1210-Seraph-Slerp/blob/main/results_2024-01-08T05-17-56.550052.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6564663991045154,
"acc_stderr": 0.031986585336666803,
"acc_norm": 0.6566440007717916,
"acc_norm_stderr": 0.03264682157479926,
"mc1": 0.3990208078335373,
"mc1_stderr": 0.017142825728496763,
"mc2": 0.5774988351776751,
"mc2_stderr": 0.015172641642340482
},
"harness|arc:challenge|25": {
"acc": 0.64419795221843,
"acc_stderr": 0.013990571137918762,
"acc_norm": 0.6791808873720137,
"acc_norm_stderr": 0.013640943091946531
},
"harness|hellaswag|10": {
"acc": 0.6709818761202948,
"acc_stderr": 0.004688963175758129,
"acc_norm": 0.8642700657239594,
"acc_norm_stderr": 0.003418015843918828
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.035149425512674394,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.035149425512674394
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.025487187147859375,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.025487187147859375
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726854,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726854
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6871794871794872,
"acc_stderr": 0.023507579020645365,
"acc_norm": 0.6871794871794872,
"acc_norm_stderr": 0.023507579020645365
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240634,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240634
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944867,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944867
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098822,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098822
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.04738975119274155,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.04738975119274155
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8365261813537676,
"acc_stderr": 0.013223928616741622,
"acc_norm": 0.8365261813537676,
"acc_norm_stderr": 0.013223928616741622
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044287,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3854748603351955,
"acc_stderr": 0.016277927039638193,
"acc_norm": 0.3854748603351955,
"acc_norm_stderr": 0.016277927039638193
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179604,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179604
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135107,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135107
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.012747248967079067,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.012747248967079067
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.02411267824090083,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.02411267824090083
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061452,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061452
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3990208078335373,
"mc1_stderr": 0.017142825728496763,
"mc2": 0.5774988351776751,
"mc2_stderr": 0.015172641642340482
},
"harness|winogrande|5": {
"acc": 0.8082083662194159,
"acc_stderr": 0.011065209664659527
},
"harness|gsm8k|5": {
"acc": 0.7225170583775588,
"acc_stderr": 0.012333447581047537
}
}
```
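As a rough illustration of how a results file like the one above can be consumed, the per-task `acc` values can be extracted from the parsed JSON and averaged. Note this is a sketch only: the tiny inlined excerpt and the simple mean are for illustration, and the leaderboard's own aggregation may differ.

```python
import json

# A small excerpt of the results above, inlined purely for illustration.
results_json = """
{
  "harness|hendrycksTest-anatomy|5": {"acc": 0.6518518518518519},
  "harness|hendrycksTest-astronomy|5": {"acc": 0.6907894736842105}
}
"""

results = json.loads(results_json)

# Collect per-task accuracies, skipping any aggregate entry such as "all".
accs = {task: metrics["acc"] for task, metrics in results.items() if task != "all"}
mean_acc = sum(accs.values()) / len(accs)
print(round(mean_acc, 4))  # → 0.6713
```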
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Yang1926/STT_Jargon | ---
task_categories:
- automatic-speech-recognition
language:
- en
tags:
- finance
pretty_name: Jargon
size_categories:
- n<1K
--- |
heliosprime/twitter_dataset_1713211788 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 22430
num_examples: 60
download_size: 20563
dataset_size: 22430
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713211788"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JWBickel/BibleDictionaries | ---
language:
- en
configs:
- config_name: default
data_files:
- split: train
path:
- "Easton's Bible Dictionary.jsonl"
- "Hitchcock's Bible Names Dictionary.jsonl"
- "Smith's Bible Dictionary.jsonl"
- "TorreysTopicalTextbook.jsonl"
- config_name: Easton
data_files:
- split: train
path: "Easton's Bible Dictionary.jsonl"
- config_name: Hitchcock
data_files:
- split: train
path: "Hitchcock's Bible Names Dictionary.jsonl"
- config_name: Smith
data_files:
- split: train
path: "Smith's Bible Dictionary.jsonl"
- config_name: Torrey
data_files:
- split: train
path: "TorreysTopicalTextbook.jsonl"
size_categories:
- 10K<n<100K
---
JSON for:
- Easton's Bible Dictionary
- Smith's Bible Dictionary
- Hitchcock's Bible Names Dictionary
- Torrey's Topical Textbook |
NobodyExistsOnTheInternet/expSharePippa | ---
license: mit
---
|
yasminesarraj/texts_summary | ---
license: openrail
---
|
nielsr/ade20k-panoptic-demo | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
- name: segments_info
list:
- name: area
dtype: int64
- name: bbox
sequence: int64
- name: category_id
dtype: int64
- name: id
dtype: int64
- name: iscrowd
dtype: int64
splits:
- name: train
num_bytes: 492746.0
num_examples: 10
- name: validation
num_bytes: 461402.0
num_examples: 10
download_size: 949392
dataset_size: 954148.0
---
# Dataset Card for "ade20k-panoptic-demo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gate369/Dynamic-Neural-Architecture-Optimization | ---
license: other
license_name: paper
license_link: LICENSE
---
Dynamic Neural Architecture Optimization (DNAO)
Title: Dynamic Neural Architecture Optimization through Adaptive Meta-Learning for Enhanced AI Efficiency
Abstract:
In this paper, I propose a novel concept called "Dynamic Neural Architecture Optimization (DNAO) through Adaptive Meta-Learning," aimed at enhancing the efficiency
and accuracy of artificial intelligence systems. By integrating a self-evolving neural network architecture that adapts in real-time to specific problem requirements
with a meta-learning component capable of learning from past experiences, our approach can optimize performance while reducing computational costs. I'll try my best to
outline the various steps involved in developing an AI model based on this concept and discuss potential libraries, resources, and techniques useful for its implementation.
1. Initial Training:
This phase focuses on training a base model using various tasks or problems to establish an initial understanding of different neural network architectures'
effectiveness across different domains. The goal is to gather diverse experience that will serve as the foundation for meta-learning.
- Data collection and preprocessing: Gather datasets for various tasks (e.g., image recognition, NLP, speech recognition, time series analysis) and prepare the data by
normalizing, augmenting, and splitting it into training/validation/testing sets as needed. Libraries such as NumPy, pandas, and scikit-learn can help with data
manipulation and preprocessing tasks.
- Neural network architectures: Experiment with various neural network designs (e.g., Convolutional Neural Networks for image recognition or Recurrent Neural Networks
for time series analysis). Deep learning libraries like TensorFlow, PyTorch, or Keras can provide a wide range of prebuilt modules to create and train these models.
- Training loop setup: Implement a standard training loop that includes data loading, model initialization, optimization algorithm selection (e.g., Adam), and model
evaluation on the validation set using metrics like accuracy, loss, and AUC. Libraries like TensorFlow, PyTorch, or Keras offer built-in APIs for these tasks.
- Model storage: Store trained models in a format that can be easily retrieved later for meta-learning. The popular formats include HDF5 (using h5py library)
or JSON (with the json module).
Steps to take:
- Data collection and preprocessing:
* Gather datasets for various tasks (e.g., CIFAR-10 for image recognition, IMDB or AG News for NLP, TIDIGITS for speech recognition, or ECG5000 for time series analysis)
* Normalize the data if necessary using libraries like NumPy or scikit-learn
* Augment the data (if needed) to improve model generalization
* Split the dataset into training, validation, and testing sets
- Neural network architectures:
* Choose appropriate models based on the task type: Convolutional Neural Networks for image recognition (e.g., VGG, ResNet), Recurrent Neural Networks for
sequence data processing (e.g., LSTM, GRU), Transformers for NLP tasks (BERT, GPT-2/3), or Feedforward networks for speech and time series analysis
- Training loop setup:
* Initialize the chosen neural network model using a library like TensorFlow, PyTorch, or Keras
* Define a loss function (e.g., cross-entropy for classification tasks) and an optimizer algorithm (Adam, SGD)
* Create a training loop with forward propagation, backpropagation, and weight update steps
* Evaluate the model's performance on validation data after each epoch using metrics like accuracy, loss, and AUC
* Store the trained models in an appropriate format for future use (e.g., HDF5 or JSON)
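The training-loop steps above can be sketched end to end with a toy example. This is only a minimal sketch, not a DNAO implementation: plain NumPy logistic regression on synthetic data stands in for the real datasets and deep architectures discussed, and every hyperparameter here is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary-classification data standing in for a real task dataset
X = rng.normal(size=(200, 4))
true_w = np.array([1.5, -2.0, 0.5, 1.0])
y = (X @ true_w + rng.normal(scale=0.1, size=200) > 0).astype(float)

# Split into training and validation sets
X_train, X_val = X[:160], X[160:]
y_train, y_val = y[:160], y[160:]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.zeros(4)   # model "initialization"
lr = 0.5          # optimizer setting

history = []
for epoch in range(50):
    # Forward pass, gradient of the logistic loss, weight update
    p = sigmoid(X_train @ w)
    grad = X_train.T @ (p - y_train) / len(y_train)
    w -= lr * grad
    # Evaluate on the validation set after each epoch
    val_acc = float(np.mean((sigmoid(X_val @ w) > 0.5) == y_val))
    history.append({"epoch": epoch, "val_acc": val_acc})

# "Model storage": keep the weights and training history together
checkpoint = {"weights": w.tolist(), "history": history}
print(history[-1]["val_acc"])
```

In a real run the loop body would be a TensorFlow/PyTorch training step, and the checkpoint would be written to HDF5 or JSON as described above.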
2. Meta-Learning Phase: Here, we aim to develop a meta-learner that can observe and learn from the base model's performance during its training process to gain insights
into effective neural network designs, their strengths and weaknesses, and the factors influencing efficiency.
- Observe the base model: Track the base model's performance on various tasks at different stages of its training. Collect relevant metrics like accuracy,
loss function values, training time, and resource utilization to provide the meta-learner with a comprehensive understanding of the base model's learning
process and efficiency.
- Develop the meta-learner: Implement machine learning or deep learning algorithms to analyze and learn from the collected data. This learner could use techniques
like reinforcement learning, supervised learning, or unsupervised learning depending on the available data and desired outcomes.
Steps to take:
- Data collection for meta-learning: Collect performance metrics from the base models' training process, including accuracy, loss function values, training time,
and resource utilization. These data can be stored in a separate file or directly appended to the model checkpoint file. Libraries like NumPy and pandas can help
manage this data efficiently.
- Meta-learner design: Choose an appropriate machine learning algorithm (e.g., reinforcement learning with Proximal Policy Optimization, supervised learning with a
regression model, or unsupervised learning with autoencoders) to learn from the meta-data collected during base model training. Libraries like TensorFlow, PyTorch,
scikit-learn, and OpenAI Gym can provide support for different machine learning algorithms.
- Hyperparameter optimization: Fine-tune hyperparameters for both the base model's training loop and the meta-learner using techniques such as grid search or Bayesian
optimization. Libraries like scikit-optimize, Optuna, and Hyperopt can help optimize hyperparameters effectively.
- Meta-learning evaluation: Assess the performance of the meta-learner by testing it on new base models trained on different tasks and datasets. Compare its predictions
against ground truth (e.g., optimal architectures for specific problems) to evaluate its learning capabilities accurately.
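As a concrete (and deliberately trivial) illustration of what the meta-learner consumes, the sketch below ranks candidate architectures from hypothetical training logs. The records and the hand-written `efficiency_score` are assumptions standing in for real collected metrics and for the learned meta-model (reinforcement, supervised, or unsupervised) discussed above.

```python
# Hypothetical meta-data records collected during base-model training runs.
# In practice these would come from real training logs (accuracy, time, memory).
runs = [
    {"arch": "cnn-small", "task": "image", "val_acc": 0.87, "train_time_s": 120},
    {"arch": "cnn-large", "task": "image", "val_acc": 0.91, "train_time_s": 600},
    {"arch": "mlp",       "task": "image", "val_acc": 0.78, "train_time_s": 45},
    {"arch": "lstm",      "task": "text",  "val_acc": 0.83, "train_time_s": 300},
]

def efficiency_score(run, acc_weight=1.0, time_weight=0.0005):
    """Trivial stand-in for a learned meta-model: trade accuracy against cost."""
    return acc_weight * run["val_acc"] - time_weight * run["train_time_s"]

def rank_architectures(runs, task):
    # Filter to the task at hand, then rank by the (placeholder) meta-model
    candidates = [r for r in runs if r["task"] == task]
    return sorted(candidates, key=efficiency_score, reverse=True)

best = rank_architectures(runs, "image")[0]
print(best["arch"])  # the large CNN's accuracy edge is outweighed by its cost
```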
3. Adaptive Architecture Generation: Based on the insights gained through meta-learning, develop an algorithm that generates customized neural network architectures
tailored to specific tasks or datasets. These architectures should be optimized for both accuracy and efficiency in a manner that dynamically adapts to new information.
- Architecture design space exploration: Generate a diverse set of possible neural network designs using different building blocks (e.g., convolutional layers, pooling
layers, recurrent layers, etc.) and connectivity patterns. These designs could range from simple to complex architectures depending on the problem's complexity and
available computational resources.
- Meta-learning-guided architecture selection: Use the insights gained from meta-learning to evaluate and rank these potential architectures based on factors like
historical performance, resource efficiency, and problem-specific features (e.g., spatial relationships for image tasks or temporal dependencies for time series
analysis).
- Adaptive architecture optimization: Apply genetic algorithms, gradient-based optimization methods, or other search techniques to refine the selected architectures
further in terms of both accuracy and resource utilization.
Steps to take:
- Architecture exploration: Implement a method to generate a diverse set of potential neural network designs based on different building blocks and connectivity patterns.
Libraries like TensorFlow or PyTorch provide useful modules (e.g., layers, optimizers) for constructing these architectures.
- Meta-learner integration: Integrate the meta-learner's insights into the architecture exploration process to rank and select candidate architectures based on their
potential performance in specific tasks or datasets. This could involve using machine learning models like Random Forests or Support Vector Machines for ranking.
- Architecture optimization: Fine-tune the selected architectures using techniques like gradient descent, genetic algorithms (using libraries such as DEAP), or Bayesian
optimization to improve their accuracy and efficiency.
- Model deployment: Incorporate the optimized neural network architecture into a new AI system that can solve specific tasks or datasets effectively.
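A stripped-down sketch of the evolutionary search described above. The fitness function is a dummy stand-in for actually training and validating each candidate; the layer widths, mutation sizes, and penalty weights are all illustrative assumptions, not tuned values.

```python
import random

random.seed(0)

def param_count(widths, in_dim=32, out_dim=10):
    """Rough parameter count of a fully-connected net with the given widths."""
    dims = [in_dim] + list(widths) + [out_dim]
    return sum(a * b + b for a, b in zip(dims, dims[1:]))

def fitness(widths):
    # Dummy stand-in for "train and validate this architecture": reward
    # capacity with diminishing returns, penalize resource cost.
    capacity = sum(w ** 0.5 for w in widths)
    return capacity - 0.0005 * param_count(widths)

def mutate(widths):
    child = list(widths)
    i = random.randrange(len(child))
    child[i] = max(4, child[i] + random.choice([-16, 16]))
    return child

# Initial design space: two hidden layers with a few candidate widths
population = [[random.choice([16, 32, 64, 128]) for _ in range(2)] for _ in range(8)]
initial_best = max(fitness(p) for p in population)

for generation in range(20):
    population.sort(key=fitness, reverse=True)
    survivors = population[:4]                                  # selection
    population = survivors + [mutate(random.choice(survivors))  # variation
                              for _ in range(4)]

best = max(population, key=fitness)
print(best, round(fitness(best), 3))
```

Because the top survivors are carried over unchanged each generation (elitism), the best fitness can only improve over the initial random population. Libraries like DEAP provide the same selection/mutation machinery with real evaluation plugged in.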
4. Continuous Optimization:
Steps to take:
- Monitoring in-situ performance: Implement mechanisms to collect feedback metrics from the deployed AI system's operation in real-time. This could involve integrating
logging and monitoring tools like TensorBoard, Weave, or Prometheus for tracking key metrics such as accuracy, response times, resource utilization, and error rates.
- Feedback processing: Use these real-time feedback metrics to update the meta-learner's understanding of effective architectures for various scenarios. Libraries like
NumPy and pandas can help process this data.
- Dynamic architecture updates: Utilize the updated insights from the meta-learner to periodically reevaluate and possibly modify the deployed neural network
architecture in real-time, improving the AI system's efficiency. This step could involve retraining the base model or applying dynamic optimization techniques
like pruning, quantization, or knowledge distillation.
- Model retraining: Incorporate feedback from the deployed AI system's performance into the base model's training process to further enhance its understanding of
effective neural network architectures across different tasks and problem domains. This step might involve revisiting the initial training stage with updated data
and improved architecture suggestions.
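One minimal way to sketch the real-time monitoring step: a sliding-window accuracy monitor that flags when the deployed system should be re-evaluated or retrained. The window size and threshold here are arbitrary placeholders; in practice these signals would come from tools like TensorBoard or Prometheus.

```python
from collections import deque

class DriftMonitor:
    """Flag retraining when live accuracy drops below a threshold
    over a sliding window of recent predictions (minimal sketch)."""

    def __init__(self, window=50, threshold=0.8):
        self.window = deque(maxlen=window)
        self.threshold = threshold

    def record(self, correct: bool) -> bool:
        # Returns True when the window is full AND accuracy has degraded
        self.window.append(1.0 if correct else 0.0)
        full = len(self.window) == self.window.maxlen
        return full and (sum(self.window) / len(self.window)) < self.threshold

monitor = DriftMonitor(window=10, threshold=0.8)

fired_during_healthy = False
for _ in range(10):                      # healthy phase: model mostly correct
    fired_during_healthy |= monitor.record(True)

needs_retrain = False
for _ in range(5):                       # degraded phase: accuracy collapses
    needs_retrain = monitor.record(False)

print(fired_during_healthy, needs_retrain)
```

When the flag fires, the surrounding system would trigger the retraining or dynamic-optimization path (pruning, quantization, distillation) described above.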
note from limin:
imma keep it 100. I need help with this. i been working on this idea for a while but im not the most skilled. someone please help |
nayohan/fmt-bench-inst | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: eval_indicator
dtype: string
splits:
- name: test
num_bytes: 72591
num_examples: 80
download_size: 36640
dataset_size: 72591
---
# Dataset Card for "fmt-bench-inst"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Survibagri/Bill-images-dataset | ---
license: gpl-2.0
---
|
RomanShp/MNIST-ResNet-Demo-Data | ---
language:
- en
pretty_name: MNIST ResNet Demo Storage
size_categories:
- n<1K
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
'3': '3'
'4': '4'
'5': '5'
'6': '6'
'7': '7'
'8': '8'
'9': '9'
splits:
- name: train
num_bytes: 571971.0
num_examples: 117
download_size: 139535
dataset_size: 571971.0
---
|
jusstinleee/speech2drug-livetest | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
- name: start_time
dtype: string
- name: end_time
dtype: string
splits:
- name: train
num_bytes: 1165001.0
num_examples: 5
- name: validation
num_bytes: 1770951.0
num_examples: 7
download_size: 2938816
dataset_size: 2935952.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
ilsp/flores200_en-el | ---
language:
- en
- el
license: cc-by-sa-4.0
size_categories:
- 1K<n<10K
task_categories:
- translation
dataset_info:
features:
- name: en
dtype: string
- name: el
dtype: string
splits:
- name: validation
num_bytes: 406555
num_examples: 997
- name: test
num_bytes: 427413
num_examples: 1012
download_size: 481524
dataset_size: 833968
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
# FLORES-200 EN-EL with prompts for translation by LLMs
Based on [FLORES-200](https://huggingface.co/datasets/Muennighoff/flores200) dataset.
Publication:

```bibtex
@article{nllb2022,
  author = {NLLB Team, Marta R. Costa-jussà, James Cross, Onur Çelebi, Maha Elbayad, Kenneth Heafield, Kevin Heffernan, Elahe Kalbassi, Janice Lam, Daniel Licht, Jean Maillard, Anna Sun, Skyler Wang, Guillaume Wenzek, Al Youngblood, Bapi Akula, Loic Barrault, Gabriel Mejia Gonzalez, Prangthip Hansanti, John Hoffman, Semarley Jarrett, Kaushik Ram Sadagopan, Dirk Rowe, Shannon Spruit, Chau Tran, Pierre Andrews, Necip Fazil Ayan, Shruti Bhosale, Sergey Edunov, Angela Fan, Cynthia Gao, Vedanuj Goswami, Francisco Guzmán, Philipp Koehn, Alexandre Mourachko, Christophe Ropers, Safiyyah Saleem, Holger Schwenk, Jeff Wang},
  title = {No Language Left Behind: Scaling Human-Centered Machine Translation},
  year = {2022}
}
```
Number of examples: 1012
## FLORES-200 for EN to EL with 0-shot prompts
Contains 2 prompt variants:
- EN:\n\[English Sentence\]\nEL:
- English:\n\[English Sentence\]\nΕλληνικά:
## FLORES-200 for EL to EN with 0-shot prompts
Contains 2 prompt variants:
- EL:\n\[Greek Sentence\]\nEN:
- Ελληνικά:\n\[Greek Sentence\]\nEnglish:
## How to load datasets
```python
from datasets import load_dataset
input_file = 'flores200.en2el.test.0-shot.json'
dataset = load_dataset(
'json',
data_files=input_file,
field='examples',
split='train'
)
```
## How to generate translation results with different configurations
```python
from multiprocessing import cpu_count
def generate_translations(datapoint, config, config_name):
for idx, variant in enumerate(datapoint["prompts_results"]):
# REPLACE generate WITH ACTUAL FUNCTION WHICH TAKES GENERATION CONFIG
result = generate(variant["prompt"], config=config)
datapoint["prompts_results"][idx].update({config_name: result})
return datapoint
dataset = dataset.map(
function=generate_translations,
fn_kwargs={"config": config, "config_name": config_name},
keep_in_memory=False,
num_proc=min(len(dataset), cpu_count()),
)
```
## How to push updated datasets to hub
```python
from huggingface_hub import HfApi
input_file = "flores200.en2el.test.0-shot.json"
model_name = "meltemi-v0.2"
output_file = input_file.replace(".json", ".{}.json".format(model_name))
dataset.to_json(output_file,
force_ascii=False,
indent=4,
orient="index")
api = HfApi()
api.upload_file(
path_or_fileobj=output_file,
    path_in_repo="results/{}/{}".format(model_name, output_file),
repo_id="ilsp/flores200-en-el-prompt",
repo_type="dataset",
)
```
|
Riyazmk/mentalhealth | ---
license: other
---
|
apurvagup/ultrachat_hindi_seamless | ---
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: test_sft
path: data/test_sft-*
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train_sft
num_bytes: 2761401316
num_examples: 185542
- name: test_sft
num_bytes: 147845678
num_examples: 10000
download_size: 952634359
dataset_size: 2909246994
---
# Dataset Card for "ultrachat_hindi_seamless"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/sr_3mp_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of sr_3mp/SR-3MP/SR-3MP (Girls' Frontline)
This is the dataset of sr_3mp/SR-3MP/SR-3MP (Girls' Frontline), containing 139 images and their tags.
The core tags of this character are `blonde_hair, long_hair, hat, twintails, purple_eyes, braid, twin_braids, very_long_hair, bangs, breasts, small_breasts, hair_between_eyes, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 139 | 183.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sr_3mp_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 139 | 101.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sr_3mp_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 353 | 225.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sr_3mp_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 139 | 160.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sr_3mp_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 353 | 320.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sr_3mp_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sr_3mp_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, black_dress, looking_at_viewer, solo, blush, white_background, black_scarf, simple_background, white_thighhighs, garter_straps, official_alternate_costume, stuffed_animal, stuffed_bunny, tongue_out, black_headwear, sleeveless_dress, bare_shoulders, black_footwear, full_body, panties, shoes, sitting, smile |
| 1 | 14 |  |  |  |  |  | 1girl, blush, solo, fingerless_gloves, looking_at_viewer, navel, simple_background, open_shirt, white_background, black_panties, smile, black_gloves, pleated_skirt, sitting, black_necktie, flat_chest, sleeveless, tongue_out |
| 2 | 5 |  |  |  |  |  | 1girl, animal_hat, bare_shoulders, black_headwear, black_skirt, blush, bunny_hat, hair_ribbon, looking_at_viewer, navel, open_shirt, pleated_skirt, simple_background, sleeveless_shirt, solo, white_background, white_shirt, black_ribbon, closed_mouth, black_jacket, black_necktie, open_jacket, signature, sleeveless_jacket, black_gloves, blue_headwear, collared_shirt, fingerless_gloves, hair_ornament, miniskirt, rabbit_ears, short_necktie |
| 3 | 6 |  |  |  |  |  | 1boy, 1girl, blush, hetero, nipples, sex, solo_focus, nude, open_mouth, tongue_out, heart, penis, vaginal, bar_censor, collarbone, cum_in_pussy, hair_ornament, loli, navel, overflow, simple_background, spread_legs |
| 4 | 6 |  |  |  |  |  | 1girl, blush, hair_ornament, solo, kimono, looking_at_viewer, obi, open_mouth, detached_collar, off_shoulder, simple_background, white_background, bare_shoulders, bikini, collarbone, flower, frills, full_body, long_sleeves, submachine_gun |
| 5 | 11 |  |  |  |  |  | 1girl, animal_ear_fluff, looking_at_viewer, solo, hairclip, long_sleeves, glasses, serafuku, official_alternate_costume, blue_one-piece_swimsuit, blush, red_neckerchief, red-framed_eyewear, school_swimsuit, swimsuit_under_clothes, black_shirt, black_skirt, white_background, backpack, black_sailor_collar, holding_gun, rabbit_ears, round_eyewear, simple_background, sitting, smile, socks |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_dress | looking_at_viewer | solo | blush | white_background | black_scarf | simple_background | white_thighhighs | garter_straps | official_alternate_costume | stuffed_animal | stuffed_bunny | tongue_out | black_headwear | sleeveless_dress | bare_shoulders | black_footwear | full_body | panties | shoes | sitting | smile | fingerless_gloves | navel | open_shirt | black_panties | black_gloves | pleated_skirt | black_necktie | flat_chest | sleeveless | animal_hat | black_skirt | bunny_hat | hair_ribbon | sleeveless_shirt | white_shirt | black_ribbon | closed_mouth | black_jacket | open_jacket | signature | sleeveless_jacket | blue_headwear | collared_shirt | hair_ornament | miniskirt | rabbit_ears | short_necktie | 1boy | hetero | nipples | sex | solo_focus | nude | open_mouth | heart | penis | vaginal | bar_censor | collarbone | cum_in_pussy | loli | overflow | spread_legs | kimono | obi | detached_collar | off_shoulder | bikini | flower | frills | long_sleeves | submachine_gun | animal_ear_fluff | hairclip | glasses | serafuku | blue_one-piece_swimsuit | red_neckerchief | red-framed_eyewear | school_swimsuit | swimsuit_under_clothes | black_shirt | backpack | black_sailor_collar | holding_gun | round_eyewear | socks |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:--------------------|:-------|:--------|:-------------------|:--------------|:--------------------|:-------------------|:----------------|:-----------------------------|:-----------------|:----------------|:-------------|:-----------------|:-------------------|:-----------------|:-----------------|:------------|:----------|:--------|:----------|:--------|:--------------------|:--------|:-------------|:----------------|:---------------|:----------------|:----------------|:-------------|:-------------|:-------------|:--------------|:------------|:--------------|:-------------------|:--------------|:---------------|:---------------|:---------------|:--------------|:------------|:--------------------|:----------------|:-----------------|:----------------|:------------|:--------------|:----------------|:-------|:---------|:----------|:------|:-------------|:-------|:-------------|:--------|:--------|:----------|:-------------|:-------------|:---------------|:-------|:-----------|:--------------|:---------|:------|:------------------|:---------------|:---------|:---------|:---------|:---------------|:-----------------|:-------------------|:-----------|:----------|:-----------|:--------------------------|:------------------|:---------------------|:------------------|:-------------------------|:--------------|:-----------|:----------------------|:--------------|:----------------|:--------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | | X | X | X | X | | X | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | X | X | X | X | | X | | | | | | | X | | X | | | | | | | X | X | X | | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | | | X | | | X | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | | X | X | X | X | | X | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | | | | | X | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 5 | 11 |  |  |  |  |  | X | | X | X | X | X | | X | | | X | | | | | | | | | | | X | X | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
BhabhaAI/Cross-Hindi-Hinglish-chat | ---
task_categories:
- text-generation
language:
- en
- hi
size_categories:
- 10K<n<100K
---
## Cross Hindi Hinglish Chat
This dataset is a subset of OpenHermes in which some parts have been converted to either Hindi or Hinglish.
Note: This is raw data. You must add instructions such as "Reply in Hindi" or "Reply in English" where appropriate.
row_ids correspond to row id starting from 0 for [OpenHermes English dataset](https://huggingface.co/datasets/teknium/OpenHermes-2.5). |
xwjzds/bbc-newskeywords | ---
dataset_info:
features:
- name: keyword
dtype: string
- name: score
dtype: float64
splits:
- name: train
num_bytes: 19513
num_examples: 1070
download_size: 17032
dataset_size: 19513
---
# Dataset Card for "bbc-newskeywords"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jw0303/test09 | ---
license: apache-2.0
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ivelin/ui_refexp_saved | ---
dataset_info:
features:
- name: image
dtype: image
- name: image_id
dtype: string
- name: image_file_path
dtype: string
- name: prompt
dtype: string
- name: target_bounding_box
dtype: string
splits:
- name: train
num_bytes: 1910805137.216
num_examples: 15624
- name: validation
num_bytes: 60403386
num_examples: 471
- name: test
num_bytes: 69078983
num_examples: 565
download_size: 1246541216
dataset_size: 2040287506.216
license: cc-by-4.0
task_categories:
- image-to-text
language:
- en
pretty_name: UIBert Referring Expressions Dataset
size_categories:
- 10K<n<100K
---
# Dataset Card for "ui_refexp_saved_Jan2023"
This is a saved snapshot of the dynamically generated [UI Bert](https://huggingface.co/datasets/ivelin/ui_refexp) dataset.
It downloads much faster than the dynamic version, which pulls and filters large data files from remote sources. |
open-llm-leaderboard/details_FPHam__Karen_TheEditor_V2_STRICT_Mistral_7B | ---
pretty_name: Evaluation run of FPHam/Karen_TheEditor_V2_STRICT_Mistral_7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [FPHam/Karen_TheEditor_V2_STRICT_Mistral_7B](https://huggingface.co/FPHam/Karen_TheEditor_V2_STRICT_Mistral_7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 1 configuration, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FPHam__Karen_TheEditor_V2_STRICT_Mistral_7B\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-03T17:38:53.248093](https://huggingface.co/datasets/open-llm-leaderboard/details_FPHam__Karen_TheEditor_V2_STRICT_Mistral_7B/blob/main/results_2023-12-03T17-38-53.248093.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3017437452615618,\n\
\ \"acc_stderr\": 0.012643544762873351\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.3017437452615618,\n \"acc_stderr\": 0.012643544762873351\n\
\ }\n}\n```"
repo_url: https://huggingface.co/FPHam/Karen_TheEditor_V2_STRICT_Mistral_7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_03T17_38_53.248093
path:
- '**/details_harness|gsm8k|5_2023-12-03T17-38-53.248093.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-03T17-38-53.248093.parquet'
- config_name: results
data_files:
- split: 2023_12_03T17_38_53.248093
path:
- results_2023-12-03T17-38-53.248093.parquet
- split: latest
path:
- results_2023-12-03T17-38-53.248093.parquet
---
# Dataset Card for Evaluation run of FPHam/Karen_TheEditor_V2_STRICT_Mistral_7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/FPHam/Karen_TheEditor_V2_STRICT_Mistral_7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [FPHam/Karen_TheEditor_V2_STRICT_Mistral_7B](https://huggingface.co/FPHam/Karen_TheEditor_V2_STRICT_Mistral_7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 1 configuration, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FPHam__Karen_TheEditor_V2_STRICT_Mistral_7B",
"harness_gsm8k_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-03T17:38:53.248093](https://huggingface.co/datasets/open-llm-leaderboard/details_FPHam__Karen_TheEditor_V2_STRICT_Mistral_7B/blob/main/results_2023-12-03T17-38-53.248093.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3017437452615618,
"acc_stderr": 0.012643544762873351
},
"harness|gsm8k|5": {
"acc": 0.3017437452615618,
"acc_stderr": 0.012643544762873351
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mteb/amazon_reviews_multi | ---
language:
- de
- en
- es
- fr
- ja
- zh
--- |
JaspervanLeuven/normal_427 | ---
dataset_info:
features:
- name: scene_name
dtype: string
- name: ground_truth
dtype: image
- name: caption
dtype: string
- name: conditioning_images_one
dtype: image
- name: conditioning_images_two
dtype: image
- name: reference_image
dtype: string
- name: prescan_images
dtype: image
splits:
- name: train
num_bytes: 20604813.0
num_examples: 14
download_size: 20563564
dataset_size: 20604813.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hanifabdlh/Setfit-Multi-Duplicate-Sample-Dataset | ---
dataset_info:
features:
- name: sample_text
dtype: string
- name: label
dtype:
class_label:
names:
'0': affirm
'1': bot_challenge
'2': deny
'3': goodbye
'4': greet
'5': grxxnsmxrt_affirm
'6': grxxnsmxrt_bot_challenge
'7': grxxnsmxrt_deny
'8': grxxnsmxrt_goodbye
'9': grxxnsmxrt_greet
'10': grxxnsmxrt_mood_great
'11': grxxnsmxrt_mood_unhappy
'12': mood_great
'13': mood_unhappy
'14': xlfxmxrt_affirm
'15': xlfxmxrt_bot_challenge
'16': xlfxmxrt_deny
'17': xlfxmxrt_goodbye
'18': xlfxmxrt_greet
'19': xlfxmxrt_mood_great
'20': xlfxmxrt_mood_unhappy
splits:
- name: train
num_bytes: 6654
num_examples: 204
download_size: 4188
dataset_size: 6654
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
HuggingFaceM4/Caltech-101 | ---
license: cc-by-4.0
---
## Code snippet to visualise the position of the box
```python
import matplotlib.image as img
import matplotlib.pyplot as plt
from datasets import load_dataset
from matplotlib.patches import Rectangle
# Load dataset
ds_name = "HuggingFaceM4/Caltech-101"
ds_config = "without_background_category"
ds_without = load_dataset(ds_name, ds_config, use_auth_token=True)
# Extract information for the sample we want to show
index = 100
sample = ds_without["train"][index]
box_coord = sample["annotation"]["box_coord"][0]
img_path = sample["image"].filename
# Create plot
# define Matplotlib figure and axis
fig, ax = plt.subplots()
# plot figure
image = img.imread(img_path)
ax.imshow(image)
# add rectangle to plot
ax.add_patch(
Rectangle((box_coord[2], box_coord[0]), box_coord[3] - box_coord[2], box_coord[1] - box_coord[0], fill=None)
)
# display plot
plt.show()
```
Result:
 |
nirajandhakal/realworldqa | ---
license: cc-by-nd-4.0
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: image
dtype: image
splits:
- name: test
num_bytes: 678377348
num_examples: 765
download_size: 678335644
dataset_size: 678377348
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
task_categories:
- visual-question-answering
language:
- en
pretty_name: RealWorldQA
---
# Real World QA Dataset
This is a benchmark dataset released by xAI under the CC BY-ND 4.0 license along with the Grok-1.5 Vision [announcement](https://x.ai/blog/grok-1.5v).
This benchmark is designed to evaluate basic real-world spatial understanding capabilities of multimodal models.
While many of the examples in the current benchmark are relatively easy for humans, they often pose a challenge for frontier models.
This release of RealWorldQA consists of 765 images, each with a question and an easily verifiable answer.
The dataset consists of anonymized images taken from vehicles, in addition to other real-world images.
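Because each answer is designed to be easily verifiable, a simple exact-match score is a natural way to evaluate model outputs on this benchmark. Below is a minimal scoring sketch (an illustration only; xAI does not prescribe a specific metric in the announcement):

```python
def normalize(answer: str) -> str:
    """Lowercase, trim whitespace, and drop a trailing period for lenient matching."""
    return answer.strip().lower().rstrip(".")

def exact_match_accuracy(predictions, references):
    """Fraction of predictions that match the reference answers after normalization."""
    assert len(predictions) == len(references)
    correct = sum(normalize(p) == normalize(r) for p, r in zip(predictions, references))
    return correct / len(references)

print(exact_match_accuracy(["Two.", "left"], ["two", "Right"]))  # 0.5
```

In practice you would pair this with the `question`/`answer`/`image` fields of the `test` split described above.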
## License
CC BY-ND 4.0 |
CyberHarem/nero_claudius_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of nero_claudius/ネロ・クラウディウス/尼禄·克劳狄乌斯 (Fate/Grand Order)
This is the dataset of nero_claudius/ネロ・クラウディウス/尼禄·克劳狄乌斯 (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are `blonde_hair, ahoge, green_eyes, breasts, hair_intakes, large_breasts, braid, ribbon, hair_bun, hair_between_eyes, hair_ribbon, single_hair_bun, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 784.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nero_claudius_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 692.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nero_claudius_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1258 | 1.27 GiB | [Download](https://huggingface.co/datasets/CyberHarem/nero_claudius_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nero_claudius_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, christmas, solo, santa_costume, looking_at_viewer, open_mouth, red_headwear, santa_hat, thighhighs, white_background, simple_background, blush, cleavage, fur-trimmed_headwear, long_sleeves, navel, french_braid, red_ribbon, :d, belt, fur-trimmed_capelet, holding, midriff, sack, shorts, thighs |
| 1 | 12 |  |  |  |  |  | 1girl, blue_sky, cleavage, cloud, criss-cross_halter, day, long_hair, navel, outdoors, smile, solo, striped_bikini, striped_clothes, bare_shoulders, bracelet, looking_at_viewer, side-tie_bikini_bottom, blush, red_bikini, closed_mouth, twintails, collarbone, water, white_ribbon |
| 2 | 6 |  |  |  |  |  | 1girl, aestus_estus, criss-cross_halter, long_hair, looking_at_viewer, navel, side-tie_bikini_bottom, smile, solo, striped_bikini, striped_clothes, cleavage, earrings, holding_sword, red_bikini, closed_mouth, petals, water, bare_shoulders |
| 3 | 13 |  |  |  |  |  | 1girl, cleavage, epaulettes, looking_at_viewer, red_dress, solo, petals, smile, juliet_sleeves, red_ribbon, open_mouth, see-through |
| 4 | 13 |  |  |  |  |  | 1girl, aestus_estus, epaulettes, holding_sword, looking_at_viewer, red_dress, solo, juliet_sleeves, petals, cleavage, open_mouth, red_ribbon, :d, french_braid, leotard, see-through, blush, standing |
| 5 | 7 |  |  |  |  |  | 1girl, solo, epaulettes, holding_flower, juliet_sleeves, looking_at_viewer, red_dress, red_rose, rose_petals, short_hair, smile, upper_body, blush, cleavage, red_ribbon, simple_background, white_background |
| 6 | 6 |  |  |  |  |  | 1girl, bare_shoulders, blush, cleavage, looking_at_viewer, necklace, official_alternate_costume, red_dress, red_gloves, solo, collarbone, hair_flower, red_rose, smile, striped_clothes, earrings, elbow_gloves, striped_dress, closed_mouth, couch, petals, red_bow, sitting |
| 7 | 15 |  |  |  |  |  | 1girl, gym_uniform, looking_at_viewer, official_alternate_costume, red_buruma, solo, french_braid, gym_shirt, short_sleeves, thighs, white_shirt, red_headband, blush, smile, name_tag, simple_background, closed_mouth, open_mouth, white_background, ass, looking_back |
| 8 | 12 |  |  |  |  |  | 1girl, belt, chain, padlock, solo, white_bodysuit, aestus_estus, looking_at_viewer, white_gloves, zipper, holding_sword, smile, bridal_veil, flower, closed_mouth |
| 9 | 9 |  |  |  |  |  | 1girl, belt, blush, bridal_veil, chain, flower, padlock, solo, white_bodysuit, white_gloves, zipper, looking_at_viewer, smile, closed_mouth, short_hair, simple_background, white_background |
| 10 | 44 |  |  |  |  |  | 1girl, solo, chain, cleavage, padlock, white_sleeves, looking_at_viewer, detached_collar, bare_shoulders, white_leotard, bridal_veil, white_thighhighs, smile, flower, white_gloves, wide_sleeves, blush, petals, strapless_leotard, closed_mouth, head_wreath, zipper_pull_tab, sidelocks, loose_belt, puffy_detached_sleeves, showgirl_skirt, sword, aestus_estus, full-length_zipper |
| 11 | 6 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, closed_mouth, collarbone, looking_at_viewer, navel, smile, solo, underwear_only, blush, bow, short_hair, lingerie, on_back, red_panties, rose_petals, arm_up, armpits, bed_sheet, lace-trimmed_bra, official_alternate_costume, pillow, plaid_panties, red_bra, red_rose, stomach, thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | christmas | solo | santa_costume | looking_at_viewer | open_mouth | red_headwear | santa_hat | thighhighs | white_background | simple_background | blush | cleavage | fur-trimmed_headwear | long_sleeves | navel | french_braid | red_ribbon | :d | belt | fur-trimmed_capelet | holding | midriff | sack | shorts | thighs | blue_sky | cloud | criss-cross_halter | day | long_hair | outdoors | smile | striped_bikini | striped_clothes | bare_shoulders | bracelet | side-tie_bikini_bottom | red_bikini | closed_mouth | twintails | collarbone | water | white_ribbon | aestus_estus | earrings | holding_sword | petals | epaulettes | red_dress | juliet_sleeves | see-through | leotard | standing | holding_flower | red_rose | rose_petals | short_hair | upper_body | necklace | official_alternate_costume | red_gloves | hair_flower | elbow_gloves | striped_dress | couch | red_bow | sitting | gym_uniform | red_buruma | gym_shirt | short_sleeves | white_shirt | red_headband | name_tag | ass | looking_back | chain | padlock | white_bodysuit | white_gloves | zipper | bridal_veil | flower | white_sleeves | detached_collar | white_leotard | white_thighhighs | wide_sleeves | strapless_leotard | head_wreath | zipper_pull_tab | sidelocks | loose_belt | puffy_detached_sleeves | showgirl_skirt | sword | full-length_zipper | underwear_only | bow | lingerie | on_back | red_panties | arm_up | armpits | bed_sheet | lace-trimmed_bra | pillow | plaid_panties | red_bra | stomach |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:------------|:-------|:----------------|:--------------------|:-------------|:---------------|:------------|:-------------|:-------------------|:--------------------|:--------|:-----------|:-----------------------|:---------------|:--------|:---------------|:-------------|:-----|:-------|:----------------------|:----------|:----------|:-------|:---------|:---------|:-----------|:--------|:---------------------|:------|:------------|:-----------|:--------|:-----------------|:------------------|:-----------------|:-----------|:-------------------------|:-------------|:---------------|:------------|:-------------|:--------|:---------------|:---------------|:-----------|:----------------|:---------|:-------------|:------------|:-----------------|:--------------|:----------|:-----------|:-----------------|:-----------|:--------------|:-------------|:-------------|:-----------|:-----------------------------|:-------------|:--------------|:---------------|:----------------|:--------|:----------|:----------|:--------------|:-------------|:------------|:----------------|:--------------|:---------------|:-----------|:------|:---------------|:--------|:----------|:-----------------|:---------------|:---------|:--------------|:---------|:----------------|:------------------|:----------------|:-------------------|:---------------|:--------------------|:--------------|:------------------|:------------|:-------------|:-------------------------|:-----------------|:--------|:---------------------|:-----------------|:------|:-----------|:----------|:--------------|:---------|:----------|:------------|:-------------------|:---------|:----------------|:----------|:----------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | | X | | X | | | | | | | X | X | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | X | | X | | | | | | | | X | | | X | | | | | | | | | | | | | X | | X | | X | X | X | X | | X | X | X | | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 13 |  |  |  |  |  | X | | X | | X | X | | | | | | | X | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 13 |  |  |  |  |  | X | | X | | X | X | | | | | | X | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 7 |  |  |  |  |  | X | | X | | X | | | | | X | X | X | X | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | X | | X | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | X | | X | X | | | | X | | X | | | | X | | X | | X | | | | | | X | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 15 |  |  |  |  |  | X | | X | | X | X | | | | X | X | X | | | | | X | | | | | | | | | X | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 12 |  |  |  |  |  | X | | X | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | | | | | | | X | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 9 |  |  |  |  |  | X | | X | | X | | | | | X | X | X | | | | | | | | X | | | | | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 44 |  |  |  |  |  | X | | X | | X | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | X | | | X | | | | X | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 11 | 6 |  |  |  |  |  | X | | X | | X | | | | X | | | X | X | | | X | | | | | | | | | | | | | | | | | X | | | X | | | | X | | X | | | | | | | | | | | | | | X | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/a8dcce01 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 184
num_examples: 10
download_size: 1327
dataset_size: 184
---
# Dataset Card for "a8dcce01"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alexmaraval/svamp_optimize_examples | ---
dataset_info:
features:
- name: Equation
dtype: string
- name: Answer
dtype: float64
- name: Type
dtype: string
- name: Question
dtype: string
- name: Body
dtype: string
- name: ID
dtype: string
- name: question
dtype: string
- name: CoT_example
dtype: string
- name: rationale
dtype: string
- name: answer
dtype: string
- name: CoT_embedding
sequence: float64
- name: question_embedding
sequence: float64
- name: rationale_embedding
sequence: float64
- name: answer_embedding
sequence: float64
splits:
- name: train
num_bytes: 15138487.714285715
num_examples: 600
download_size: 11446640
dataset_size: 15138487.714285715
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
openskyml/wikipedia | ---
annotations_creators:
- no-annotation
language_creators:
- crowdsourced
pretty_name: Wikipedia
paperswithcode_id: null
license:
- cc-by-sa-3.0
- gfdl
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
source_datasets:
- original
multilinguality:
- multilingual
size_categories:
- n<1K
- 1K<n<10K
- 10K<n<100K
- 100K<n<1M
- 1M<n<10M
language:
- aa
- ab
- ace
- af
- ak
- als
- am
- an
- ang
- ar
- arc
- arz
- as
- ast
- atj
- av
- ay
- az
- azb
- ba
- bar
- bcl
- be
- bg
- bh
- bi
- bjn
- bm
- bn
- bo
- bpy
- br
- bs
- bug
- bxr
- ca
- cbk
- cdo
- ce
- ceb
- ch
- cho
- chr
- chy
- ckb
- co
- cr
- crh
- cs
- csb
- cu
- cv
- cy
- da
- de
- din
- diq
- dsb
- dty
- dv
- dz
- ee
- el
- eml
- en
- eo
- es
- et
- eu
- ext
- fa
- ff
- fi
- fj
- fo
- fr
- frp
- frr
- fur
- fy
- ga
- gag
- gan
- gd
- gl
- glk
- gn
- gom
- gor
- got
- gu
- gv
- ha
- hak
- haw
- he
- hi
- hif
- ho
- hr
- hsb
- ht
- hu
- hy
- ia
- id
- ie
- ig
- ii
- ik
- ilo
- inh
- io
- is
- it
- iu
- ja
- jam
- jbo
- jv
- ka
- kaa
- kab
- kbd
- kbp
- kg
- ki
- kj
- kk
- kl
- km
- kn
- ko
- koi
- krc
- ks
- ksh
- ku
- kv
- kw
- ky
- la
- lad
- lb
- lbe
- lez
- lfn
- lg
- li
- lij
- lmo
- ln
- lo
- lrc
- lt
- ltg
- lv
- lzh
- mai
- mdf
- mg
- mh
- mhr
- mi
- min
- mk
- ml
- mn
- mr
- mrj
- ms
- mt
- mus
- mwl
- my
- myv
- mzn
- na
- nah
- nan
- nap
- nds
- ne
- new
- ng
- nl
- nn
- 'no'
- nov
- nrf
- nso
- nv
- ny
- oc
- olo
- om
- or
- os
- pa
- pag
- pam
- pap
- pcd
- pdc
- pfl
- pi
- pih
- pl
- pms
- pnb
- pnt
- ps
- pt
- qu
- rm
- rmy
- rn
- ro
- ru
- rue
- rup
- rw
- sa
- sah
- sat
- sc
- scn
- sco
- sd
- se
- sg
- sgs
- sh
- si
- sk
- sl
- sm
- sn
- so
- sq
- sr
- srn
- ss
- st
- stq
- su
- sv
- sw
- szl
- ta
- tcy
- tdt
- te
- tg
- th
- ti
- tk
- tl
- tn
- to
- tpi
- tr
- ts
- tt
- tum
- tw
- ty
- tyv
- udm
- ug
- uk
- ur
- uz
- ve
- vec
- vep
- vi
- vls
- vo
- vro
- wa
- war
- wo
- wuu
- xal
- xh
- xmf
- yi
- yo
- yue
- za
- zea
- zh
- zu
language_bcp47:
- nds-nl
dataset_info:
- config_name: 20220301.de
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 8905282792
num_examples: 2665357
download_size: 6523215105
dataset_size: 8905282792
- config_name: 20220301.en
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 20275516160
num_examples: 6458670
download_size: 20598313936
dataset_size: 20275516160
- config_name: 20220301.fr
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 7375920768
num_examples: 2402095
download_size: 5602565274
dataset_size: 7375920768
- config_name: 20220301.frr
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 9129760
num_examples: 15199
download_size: 12438017
dataset_size: 9129760
- config_name: 20220301.it
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 4539944448
num_examples: 1743035
download_size: 3516441239
dataset_size: 4539944448
- config_name: 20220301.simple
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 235072360
num_examples: 205328
download_size: 239682796
dataset_size: 235072360
config_names:
- 20220301.aa
- 20220301.ab
- 20220301.ace
- 20220301.ady
- 20220301.af
- 20220301.ak
- 20220301.als
- 20220301.am
- 20220301.an
- 20220301.ang
- 20220301.ar
- 20220301.arc
- 20220301.arz
- 20220301.as
- 20220301.ast
- 20220301.atj
- 20220301.av
- 20220301.ay
- 20220301.az
- 20220301.azb
- 20220301.ba
- 20220301.bar
- 20220301.bat-smg
- 20220301.bcl
- 20220301.be
- 20220301.be-x-old
- 20220301.bg
- 20220301.bh
- 20220301.bi
- 20220301.bjn
- 20220301.bm
- 20220301.bn
- 20220301.bo
- 20220301.bpy
- 20220301.br
- 20220301.bs
- 20220301.bug
- 20220301.bxr
- 20220301.ca
- 20220301.cbk-zam
- 20220301.cdo
- 20220301.ce
- 20220301.ceb
- 20220301.ch
- 20220301.cho
- 20220301.chr
- 20220301.chy
- 20220301.ckb
- 20220301.co
- 20220301.cr
- 20220301.crh
- 20220301.cs
- 20220301.csb
- 20220301.cu
- 20220301.cv
- 20220301.cy
- 20220301.da
- 20220301.de
- 20220301.din
- 20220301.diq
- 20220301.dsb
- 20220301.dty
- 20220301.dv
- 20220301.dz
- 20220301.ee
- 20220301.el
- 20220301.eml
- 20220301.en
- 20220301.eo
- 20220301.es
- 20220301.et
- 20220301.eu
- 20220301.ext
- 20220301.fa
- 20220301.ff
- 20220301.fi
- 20220301.fiu-vro
- 20220301.fj
- 20220301.fo
- 20220301.fr
- 20220301.frp
- 20220301.frr
- 20220301.fur
- 20220301.fy
- 20220301.ga
- 20220301.gag
- 20220301.gan
- 20220301.gd
- 20220301.gl
- 20220301.glk
- 20220301.gn
- 20220301.gom
- 20220301.gor
- 20220301.got
- 20220301.gu
- 20220301.gv
- 20220301.ha
- 20220301.hak
- 20220301.haw
- 20220301.he
- 20220301.hi
- 20220301.hif
- 20220301.ho
- 20220301.hr
- 20220301.hsb
- 20220301.ht
- 20220301.hu
- 20220301.hy
- 20220301.ia
- 20220301.id
- 20220301.ie
- 20220301.ig
- 20220301.ii
- 20220301.ik
- 20220301.ilo
- 20220301.inh
- 20220301.io
- 20220301.is
- 20220301.it
- 20220301.iu
- 20220301.ja
- 20220301.jam
- 20220301.jbo
- 20220301.jv
- 20220301.ka
- 20220301.kaa
- 20220301.kab
- 20220301.kbd
- 20220301.kbp
- 20220301.kg
- 20220301.ki
- 20220301.kj
- 20220301.kk
- 20220301.kl
- 20220301.km
- 20220301.kn
- 20220301.ko
- 20220301.koi
- 20220301.krc
- 20220301.ks
- 20220301.ksh
- 20220301.ku
- 20220301.kv
- 20220301.kw
- 20220301.ky
- 20220301.la
- 20220301.lad
- 20220301.lb
- 20220301.lbe
- 20220301.lez
- 20220301.lfn
- 20220301.lg
- 20220301.li
- 20220301.lij
- 20220301.lmo
- 20220301.ln
- 20220301.lo
- 20220301.lrc
- 20220301.lt
- 20220301.ltg
- 20220301.lv
- 20220301.mai
- 20220301.map-bms
- 20220301.mdf
- 20220301.mg
- 20220301.mh
- 20220301.mhr
- 20220301.mi
- 20220301.min
- 20220301.mk
- 20220301.ml
- 20220301.mn
- 20220301.mr
- 20220301.mrj
- 20220301.ms
- 20220301.mt
- 20220301.mus
- 20220301.mwl
- 20220301.my
- 20220301.myv
- 20220301.mzn
- 20220301.na
- 20220301.nah
- 20220301.nap
- 20220301.nds
- 20220301.nds-nl
- 20220301.ne
- 20220301.new
- 20220301.ng
- 20220301.nl
- 20220301.nn
- 20220301.no
- 20220301.nov
- 20220301.nrm
- 20220301.nso
- 20220301.nv
- 20220301.ny
- 20220301.oc
- 20220301.olo
- 20220301.om
- 20220301.or
- 20220301.os
- 20220301.pa
- 20220301.pag
- 20220301.pam
- 20220301.pap
- 20220301.pcd
- 20220301.pdc
- 20220301.pfl
- 20220301.pi
- 20220301.pih
- 20220301.pl
- 20220301.pms
- 20220301.pnb
- 20220301.pnt
- 20220301.ps
- 20220301.pt
- 20220301.qu
- 20220301.rm
- 20220301.rmy
- 20220301.rn
- 20220301.ro
- 20220301.roa-rup
- 20220301.roa-tara
- 20220301.ru
- 20220301.rue
- 20220301.rw
- 20220301.sa
- 20220301.sah
- 20220301.sat
- 20220301.sc
- 20220301.scn
- 20220301.sco
- 20220301.sd
- 20220301.se
- 20220301.sg
- 20220301.sh
- 20220301.si
- 20220301.simple
- 20220301.sk
- 20220301.sl
- 20220301.sm
- 20220301.sn
- 20220301.so
- 20220301.sq
- 20220301.sr
- 20220301.srn
- 20220301.ss
- 20220301.st
- 20220301.stq
- 20220301.su
- 20220301.sv
- 20220301.sw
- 20220301.szl
- 20220301.ta
- 20220301.tcy
- 20220301.te
- 20220301.tet
- 20220301.tg
- 20220301.th
- 20220301.ti
- 20220301.tk
- 20220301.tl
- 20220301.tn
- 20220301.to
- 20220301.tpi
- 20220301.tr
- 20220301.ts
- 20220301.tt
- 20220301.tum
- 20220301.tw
- 20220301.ty
- 20220301.tyv
- 20220301.udm
- 20220301.ug
- 20220301.uk
- 20220301.ur
- 20220301.uz
- 20220301.ve
- 20220301.vec
- 20220301.vep
- 20220301.vi
- 20220301.vls
- 20220301.vo
- 20220301.wa
- 20220301.war
- 20220301.wo
- 20220301.wuu
- 20220301.xal
- 20220301.xh
- 20220301.xmf
- 20220301.yi
- 20220301.yo
- 20220301.za
- 20220301.zea
- 20220301.zh
- 20220301.zh-classical
- 20220301.zh-min-nan
- 20220301.zh-yue
- 20220301.zu
---
# Dataset Card for Wikipedia
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://dumps.wikimedia.org](https://dumps.wikimedia.org)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Dataset Summary
Wikipedia dataset containing cleaned articles of all languages.
The datasets are built from the Wikipedia dump
(https://dumps.wikimedia.org/) with one split per language. Each example
contains the content of one full Wikipedia article with cleaning to strip
markup and unwanted sections (references, etc.).
The articles are parsed using the ``mwparserfromhell`` tool.
To load this dataset you need to install Apache Beam and ``mwparserfromhell`` first:
```
pip install apache_beam mwparserfromhell
```
Then, you can load any subset of Wikipedia per language and per date this way:
```python
from datasets import load_dataset
load_dataset("wikipedia", language="sw", date="20220120", beam_runner=...)
```
where you can pass as `beam_runner` any Apache Beam supported runner for (distributed) data processing
(see [here](https://beam.apache.org/documentation/runners/capability-matrix/)).
Pass "DirectRunner" to run it on your machine.
You can find the full list of languages and dates [here](https://dumps.wikimedia.org/backup-index.html).
Some subsets of Wikipedia have already been processed by HuggingFace, and you can load them just with:
```python
from datasets import load_dataset
load_dataset("wikipedia", "20220301.en")
```
The list of pre-processed subsets is:
- "20220301.de"
- "20220301.en"
- "20220301.fr"
- "20220301.frr"
- "20220301.it"
- "20220301.simple"
### Supported Tasks and Leaderboards
The dataset is generally used for Language Modeling.
### Languages
You can find the list of languages [here](https://meta.wikimedia.org/wiki/List_of_Wikipedias).
## Dataset Structure
### Data Instances
An example looks as follows:
```
{'id': '1',
'url': 'https://simple.wikipedia.org/wiki/April',
'title': 'April',
'text': 'April is the fourth month...'
}
```
Some subsets of Wikipedia have already been processed by HuggingFace, as you can see below:
#### 20220301.de
- **Size of downloaded dataset files:** 6.84 GB
- **Size of the generated dataset:** 9.34 GB
- **Total amount of disk used:** 16.18 GB
#### 20220301.en
- **Size of downloaded dataset files:** 21.60 GB
- **Size of the generated dataset:** 21.26 GB
- **Total amount of disk used:** 42.86 GB
#### 20220301.fr
- **Size of downloaded dataset files:** 5.87 GB
- **Size of the generated dataset:** 7.73 GB
- **Total amount of disk used:** 13.61 GB
#### 20220301.frr
- **Size of downloaded dataset files:** 13.04 MB
- **Size of the generated dataset:** 9.57 MB
- **Total amount of disk used:** 22.62 MB
#### 20220301.it
- **Size of downloaded dataset files:** 3.69 GB
- **Size of the generated dataset:** 4.76 GB
- **Total amount of disk used:** 8.45 GB
#### 20220301.simple
- **Size of downloaded dataset files:** 251.32 MB
- **Size of the generated dataset:** 246.49 MB
- **Total amount of disk used:** 497.82 MB
### Data Fields
The data fields are the same among all configurations:
- `id` (`str`): ID of the article.
- `url` (`str`): URL of the article.
- `title` (`str`): Title of the article.
- `text` (`str`): Text content of the article.
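For a quick sanity check, the schema can be expressed directly in Python (a minimal sketch; the sample values are copied from the "Data Instances" example above):

```python
# Minimal sketch of the record schema shared by all configurations.
# The sample values are copied from the "Data Instances" example above.
record = {
    "id": "1",
    "url": "https://simple.wikipedia.org/wiki/April",
    "title": "April",
    "text": "April is the fourth month...",
}

# Every configuration exposes exactly these four string-typed fields.
assert set(record) == {"id", "url", "title", "text"}
assert all(isinstance(value, str) for value in record.values())
```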
### Data Splits
Here are the example counts for several configurations:
| name | train |
|-----------------|--------:|
| 20220301.de | 2665357 |
| 20220301.en | 6458670 |
| 20220301.fr | 2402095 |
| 20220301.frr | 15199 |
| 20220301.it | 1743035 |
| 20220301.simple | 205328 |
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
Most of Wikipedia's text and many of its images are co-licensed under the
[Creative Commons Attribution-ShareAlike 3.0 Unported License](https://en.wikipedia.org/wiki/Wikipedia:Text_of_Creative_Commons_Attribution-ShareAlike_3.0_Unported_License)
(CC BY-SA) and the [GNU Free Documentation License](https://en.wikipedia.org/wiki/Wikipedia:Text_of_the_GNU_Free_Documentation_License)
(GFDL) (unversioned, with no invariant sections, front-cover texts, or back-cover texts).
Some text has been imported only under CC BY-SA and CC BY-SA-compatible license and cannot be reused under GFDL; such
text will be identified on the page footer, in the page history, or on the discussion page of the article that utilizes
the text.
### Citation Information
```
@ONLINE{wikidump,
author = "Wikimedia Foundation",
title = "Wikimedia Downloads",
url = "https://dumps.wikimedia.org"
}
```
### Contributions
Thanks to [@lewtun](https://github.com/lewtun), [@mariamabarham](https://github.com/mariamabarham), [@thomwolf](https://github.com/thomwolf), [@lhoestq](https://github.com/lhoestq), [@patrickvonplaten](https://github.com/patrickvonplaten) for adding this dataset. |
tyzhu/lmind_nq_train10000_eval6489_v1_reciteonly_qa | ---
configs:
- config_name: default
data_files:
- split: train_qa
path: data/train_qa-*
- split: train_recite_qa
path: data/train_recite_qa-*
- split: eval_qa
path: data/eval_qa-*
- split: eval_recite_qa
path: data/eval_recite_qa-*
- split: all_docs
path: data/all_docs-*
- split: all_docs_eval
path: data/all_docs_eval-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train_qa
num_bytes: 1159729
num_examples: 10000
- name: train_recite_qa
num_bytes: 7573876
num_examples: 10000
- name: eval_qa
num_bytes: 752802
num_examples: 6489
- name: eval_recite_qa
num_bytes: 4912675
num_examples: 6489
- name: all_docs
num_bytes: 9144930
num_examples: 14014
- name: all_docs_eval
num_bytes: 9144126
num_examples: 14014
- name: train
num_bytes: 7573876
num_examples: 10000
- name: validation
num_bytes: 4912675
num_examples: 6489
download_size: 27978361
dataset_size: 45174689
---
# Dataset Card for "lmind_nq_train10000_eval6489_v1_reciteonly_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JotDe/data-members-2k | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 291890903.6885961
num_examples: 2000
download_size: 256657803
dataset_size: 291890903.6885961
---
# Dataset Card for "data-members-2k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
smeintadmin/image_intents | ---
license: apache-2.0
---
|
runningsnake/github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: labels
list:
- name: id
dtype: int64
- name: node_id
dtype: string
- name: url
dtype: string
- name: name
dtype: string
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: assignees
list:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: milestone
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: labels_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: description
dtype: string
- name: creator
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: open_issues
dtype: int64
- name: closed_issues
dtype: int64
- name: state
dtype: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: due_on
dtype: 'null'
- name: closed_at
dtype: 'null'
- name: comments
sequence: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: closed_at
dtype: timestamp[s]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: 'null'
- name: draft
dtype: bool
- name: pull_request
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: diff_url
dtype: string
- name: patch_url
dtype: string
- name: merged_at
dtype: timestamp[s]
- name: body
dtype: string
- name: reactions
struct:
- name: url
dtype: string
- name: total_count
dtype: int64
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: laugh
dtype: int64
- name: hooray
dtype: int64
- name: confused
dtype: int64
- name: heart
dtype: int64
- name: rocket
dtype: int64
- name: eyes
dtype: int64
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: state_reason
dtype: string
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 14798625
num_examples: 2000
download_size: 4053110
dataset_size: 14798625
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- text-retrieval
- text-classification
language:
- en
pretty_name: Hugging Face GitHub Issues
---
# Dataset Card for "github-issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yakuplucilingirnet/yakuplucilingir.net | ---
license: apache-2.0
---
|
CyberHarem/pamiat_merkuria_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of pamiat_merkuria/パーミャチ・メルクーリヤ/水星纪念 (Azur Lane)
This is the dataset of pamiat_merkuria/パーミャチ・メルクーリヤ/水星纪念 (Azur Lane), containing 349 images and their tags.
The core tags of this character are `long_hair, breasts, large_breasts, black_hair, purple_eyes, bangs, hat, one_side_up, white_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 349 | 549.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pamiat_merkuria_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 349 | 279.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pamiat_merkuria_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 919 | 648.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pamiat_merkuria_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 349 | 473.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pamiat_merkuria_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 919 | 999.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pamiat_merkuria_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/pamiat_merkuria_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 32 |  |  |  |  |  | 1girl, long_sleeves, solo, white_coat, black_gloves, blush, looking_at_viewer, open_mouth, black_thighhighs, fur-trimmed_coat, cleavage, white_background, simple_background, :d, fang |
| 1 | 9 |  |  |  |  |  | 1girl, blush, looking_at_viewer, smile, solo, white_gloves, fur_trim, pantyhose, black_footwear, boots, coat, full_body, sideboob, bare_shoulders, simple_background, very_long_hair, :p, retrofit_(azur_lane), standing |
| 2 | 28 |  |  |  |  |  | 1girl, prison_clothes, blush, solo, looking_at_viewer, long_sleeves, striped_headwear, chain, open_mouth, bare_shoulders, white_thighhighs, torn_thighhighs, medium_breasts, striped_shirt, off_shoulder, cuffs, cleavage, red_eyes, fang, smile |
| 3 | 17 |  |  |  |  |  | 1boy, 1girl, blush, hetero, nipples, solo_focus, penis, sex, vaginal, open_mouth, pussy, thighhighs, navel, looking_at_viewer, mosaic_censoring, sweat, cowgirl_position, girl_on_top, nude |
| 4 | 11 |  |  |  |  |  | 1girl, blush, cat_ears, paw_gloves, solo, looking_at_viewer, cleavage, naked_apron, smile, black_thighhighs, cat_tail, hair_ornament, open_mouth, fake_animal_ears, fang, heart, pink_eyes, armpits, chocolate_on_breasts, pillow, thighs, white_apron, brown_hair, cat_paws, lying, tail_ornament |
| 5 | 17 |  |  |  |  |  | 1girl, blush, looking_at_viewer, official_alternate_costume, open_cardigan, solo, off_shoulder, bare_shoulders, cleavage, black_hairband, collarbone, open_mouth, black_choker, cherry, long_sleeves, smile, shorts, white_camisole, navel, grey_hair, midriff, ribbon, sitting |
| 6 | 20 |  |  |  |  |  | 1girl, fake_animal_ears, looking_at_viewer, rabbit_ears, playboy_bunny, solo, blush, bare_shoulders, hairclip, purple_jacket, black_leotard, underboob_cutout, covered_navel, simple_background, smile, open_mouth, white_background, wrist_cuffs, bow, cowboy_shot, off_shoulder, open_clothes, rabbit_tail |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_sleeves | solo | white_coat | black_gloves | blush | looking_at_viewer | open_mouth | black_thighhighs | fur-trimmed_coat | cleavage | white_background | simple_background | :d | fang | smile | white_gloves | fur_trim | pantyhose | black_footwear | boots | coat | full_body | sideboob | bare_shoulders | very_long_hair | :p | retrofit_(azur_lane) | standing | prison_clothes | striped_headwear | chain | white_thighhighs | torn_thighhighs | medium_breasts | striped_shirt | off_shoulder | cuffs | red_eyes | 1boy | hetero | nipples | solo_focus | penis | sex | vaginal | pussy | thighhighs | navel | mosaic_censoring | sweat | cowgirl_position | girl_on_top | nude | cat_ears | paw_gloves | naked_apron | cat_tail | hair_ornament | fake_animal_ears | heart | pink_eyes | armpits | chocolate_on_breasts | pillow | thighs | white_apron | brown_hair | cat_paws | lying | tail_ornament | official_alternate_costume | open_cardigan | black_hairband | collarbone | black_choker | cherry | shorts | white_camisole | grey_hair | midriff | ribbon | sitting | rabbit_ears | playboy_bunny | hairclip | purple_jacket | black_leotard | underboob_cutout | covered_navel | wrist_cuffs | bow | cowboy_shot | open_clothes | rabbit_tail |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-------|:-------------|:---------------|:--------|:--------------------|:-------------|:-------------------|:-------------------|:-----------|:-------------------|:--------------------|:-----|:-------|:--------|:---------------|:-----------|:------------|:-----------------|:--------|:-------|:------------|:-----------|:-----------------|:-----------------|:-----|:-----------------------|:-----------|:-----------------|:-------------------|:--------|:-------------------|:------------------|:-----------------|:----------------|:---------------|:--------|:-----------|:-------|:---------|:----------|:-------------|:--------|:------|:----------|:--------|:-------------|:--------|:-------------------|:--------|:-------------------|:--------------|:-------|:-----------|:-------------|:--------------|:-----------|:----------------|:-------------------|:--------|:------------|:----------|:-----------------------|:---------|:---------|:--------------|:-------------|:-----------|:--------|:----------------|:-----------------------------|:----------------|:-----------------|:-------------|:---------------|:---------|:---------|:-----------------|:------------|:----------|:---------|:----------|:--------------|:----------------|:-----------|:----------------|:----------------|:-------------------|:----------------|:--------------|:------|:--------------|:---------------|:--------------|
| 0 | 32 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | | X | | | X | X | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 28 |  |  |  |  |  | X | X | X | | | X | X | X | | | X | | | | X | X | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 17 |  |  |  |  |  | X | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 11 |  |  |  |  |  | X | | X | | | X | X | X | X | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 17 |  |  |  |  |  | X | X | X | | | X | X | X | | | X | | | | | X | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 6 | 20 |  |  |  |  |  | X | | X | | | X | X | X | | | | X | X | | | X | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
irds/hc4_ru | ---
pretty_name: '`hc4/ru`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `hc4/ru`
The `hc4/ru` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/hc4#hc4/ru).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=4,721,064
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/hc4_ru', 'docs')
for record in docs:
record # {'doc_id': ..., 'title': ..., 'text': ..., 'url': ..., 'time': ..., 'cc_file': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Lawrie2022HC4,
author = {Dawn Lawrie and James Mayfield and Douglas W. Oard and Eugene Yang},
title = {HC4: A New Suite of Test Collections for Ad Hoc CLIR},
  booktitle = {Advances in Information Retrieval. 44th European Conference on IR Research (ECIR 2022)},
year = {2022},
month = apr,
publisher = {Springer},
series = {Lecture Notes in Computer Science},
site = {Stavanger, Norway},
url = {https://arxiv.org/abs/2201.09992}
}
```
|
juewang/misc-data | ---
language:
- en
---
# juewang/misc-data |
ConseggioLigure/seed-instruct-lij-eng | ---
license: cc-by-sa-4.0
task_categories:
- conversational
- translation
pretty_name: OLDI Seed lij-eng translation dataset (instruction-style)
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: template_id
dtype: int64
- name: template_lang
sequence: string
splits:
- name: train
num_bytes: 2381132
num_examples: 5802
- name: dev
num_bytes: 79921
num_examples: 189
- name: test
num_bytes: 87507
num_examples: 202
download_size: 1292161
dataset_size: 2548560
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
This is a Ligurian→English sentence-level translation dataset.
The original data comes from the [OLDI](https://www.oldi.org) [Seed dataset](https://github.com/openlanguagedata/seed), and it has been converted to the instruction format.
The prompts, written in Ligurian, ask the model to translate the text into English. There are several variants of the prompt template, one of which was randomly sampled for each sentence:
```
Traduxi in ingleise: <sentence>
Traduxi da-o zeneise à l’ingleise: <sentence>
Traduxi da-o ligure à l’ingleise: <sentence>
Traduxi sto testo in ingleise: <sentence>
Traduxi in lengua ingleise: <sentence>
Traduxi sto testo da-o zeneise à l’ingleise: <sentence>
Traduxi sto testo da-o ligure à l’ingleise: <sentence>
Comm’à l’é a traduçion ingleise de sto testo? <sentence>
Quæ a l’é a traduçion ingleise de sto testo? <sentence>
Ti peu tradue sto testo in ingleise? <sentence>
```
The prompt template used for each dataset entry is referenced in the column `template_id`, with ids ranging from 1 to 10 according to the order given above.
The targets are always prefixed with the string _"A traduçion in ingleise do testo a l’é:"_ ("The English translation of the sentence is:"), followed by the English sentence.
The correspondence between `template_id`, prompt template and target template is therefore:
```
[
  (1, "Traduxi in ingleise:\n", "A traduçion in ingleise do testo a l’é:\n"),
  (2, "Traduxi da-o zeneise à l’ingleise:\n", "A traduçion in ingleise do testo a l’é:\n"),
  (3, "Traduxi da-o ligure à l’ingleise:\n", "A traduçion in ingleise do testo a l’é:\n"),
  (4, "Traduxi sto testo in ingleise:\n", "A traduçion in ingleise do testo a l’é:\n"),
  (5, "Traduxi in lengua ingleise:\n", "A traduçion in ingleise do testo a l’é:\n"),
  (6, "Traduxi sto testo da-o zeneise à l’ingleise:\n", "A traduçion in ingleise do testo a l’é:\n"),
  (7, "Traduxi sto testo da-o ligure à l’ingleise:\n", "A traduçion in ingleise do testo a l’é:\n"),
  (8, "Comm’à l’é a traduçion ingleise de sto testo?\n", "A traduçion in ingleise do testo a l’é:\n"),
  (9, "Quæ a l’é a traduçion ingleise de sto testo?\n", "A traduçion in ingleise do testo a l’é:\n"),
  (10, "Ti peu tradue sto testo in ingleise?\n", "A traduçion in ingleise do testo a l’é:\n"),
]
```
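As an illustration, one instruction-style example could be reconstructed from these templates as follows (a minimal sketch: the template texts are taken from the list above, while `build_example` and the sample sentence pair are hypothetical and not part of the dataset tooling):

```python
# Sketch: assembling an instruction-style example from a prompt template.
# Template texts are copied from the correspondence list above;
# build_example and the sample sentences are hypothetical illustrations.
PROMPT_TEMPLATES = {
    1: "Traduxi in ingleise:\n",
    8: "Comm’à l’é a traduçion ingleise de sto testo?\n",
}
TARGET_PREFIX = "A traduçion in ingleise do testo a l’é:\n"

def build_example(lij_sentence: str, eng_sentence: str, template_id: int) -> dict:
    return {
        "inputs": PROMPT_TEMPLATES[template_id] + lij_sentence,
        "targets": TARGET_PREFIX + eng_sentence,
        "template_id": template_id,
    }

example = build_example("Bongiorno à tutti.", "Good morning, everyone.", 1)
print(example["inputs"])
print(example["targets"])
```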
The dataset contains 5802 train samples, 189 validation samples and 202 test samples. |
WahtsMyName/kdd2023-FR | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 181342267
num_examples: 117561
download_size: 82064276
dataset_size: 181342267
---
# Dataset Card for "kdd2023-FR"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
modelloosrvcc/melodie | ---
license: openrail
---
|
tyzhu/squad_baseline_train_10_eval_10 | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 52389
num_examples: 51
- name: validation
num_bytes: 58313
num_examples: 48
download_size: 0
dataset_size: 110702
---
# Dataset Card for "squad_baseline_train_10_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/ujiie_mutsumi_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ujiie_mutsumi/氏家むつみ (THE iDOLM@STER: Cinderella Girls)
This is the dataset of ujiie_mutsumi/氏家むつみ (THE iDOLM@STER: Cinderella Girls), containing 30 images and their tags.
The core tags of this character are `black_hair, long_hair, bangs, braid, blunt_bangs, blue_eyes, single_braid`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 30 | 22.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ujiie_mutsumi_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 30 | 18.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ujiie_mutsumi_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 64 | 34.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ujiie_mutsumi_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 30 | 21.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ujiie_mutsumi_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 64 | 39.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ujiie_mutsumi_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ujiie_mutsumi_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable in these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, solo, smile, open_mouth, earrings, hat, skirt, thighhighs, belt, card_(medium), character_name, gem_(symbol), necklace |
| 1 | 13 |  |  |  |  |  | 1girl, blush, solo, looking_at_viewer, open_mouth, smile, long_sleeves, hair_ornament, hair_over_shoulder, simple_background, sweat, white_background, white_shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | open_mouth | earrings | hat | skirt | thighhighs | belt | card_(medium) | character_name | gem_(symbol) | necklace | blush | looking_at_viewer | long_sleeves | hair_ornament | hair_over_shoulder | simple_background | sweat | white_background | white_shirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:-------------|:-----------|:------|:--------|:-------------|:-------|:----------------|:-----------------|:---------------|:-----------|:--------|:--------------------|:---------------|:----------------|:---------------------|:--------------------|:--------|:-------------------|:--------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | X | X | X | | | | | | | | | | X | X | X | X | X | X | X | X | X |
|
danjacobellis/aria_ea_audio | ---
dataset_info:
features:
- name: audio
dtype:
array2_d:
shape:
- 300000
- 7
dtype: int32
- name: seq_name
dtype: string
splits:
- name: loc3_script4_seq4_rec1
num_bytes: 806402520
num_examples: 84
- name: loc4_script1_seq3_rec1
num_bytes: 163200510
num_examples: 17
- name: loc4_script2_seq7_rec1
num_bytes: 192000600
num_examples: 20
- name: loc5_script5_seq7_rec1
num_bytes: 403201260
num_examples: 42
- name: loc3_script2_seq5_rec1
num_bytes: 201600630
num_examples: 21
- name: loc3_script5_seq7_rec1
num_bytes: 211200660
num_examples: 22
- name: loc2_script2_seq5_rec2
num_bytes: 393601230
num_examples: 41
- name: loc1_script1_seq5_rec1
num_bytes: 182400570
num_examples: 19
- name: loc3_script2_seq4_rec2
num_bytes: 172800540
num_examples: 18
- name: loc3_script1_seq6_rec1
num_bytes: 144000450
num_examples: 15
- name: loc4_script3_seq1_rec2
num_bytes: 96000300
num_examples: 10
- name: loc2_script2_seq8_rec2
num_bytes: 144000450
num_examples: 15
- name: loc2_script1_seq1_rec1
num_bytes: 307200960
num_examples: 32
- name: loc4_script2_seq6_rec1
num_bytes: 67200210
num_examples: 7
- name: loc1_script1_seq7_rec1
num_bytes: 393601230
num_examples: 41
- name: loc2_script4_seq3_rec1
num_bytes: 345601080
num_examples: 36
- name: loc3_script2_seq3_rec2
num_bytes: 288000900
num_examples: 30
- name: loc3_script3_seq5_rec2
num_bytes: 288000900
num_examples: 30
- name: loc1_script2_seq8_rec2
num_bytes: 153600480
num_examples: 16
- name: loc1_script4_seq2_rec1
num_bytes: 403201260
num_examples: 42
- name: loc3_script3_seq4_rec1
num_bytes: 432001350
num_examples: 45
- name: loc3_script3_seq2_rec2
num_bytes: 259200810
num_examples: 27
- name: loc1_script1_seq3_rec1
num_bytes: 576001800
num_examples: 60
- name: loc3_script3_seq1_rec1
num_bytes: 96000300
num_examples: 10
- name: loc1_script2_seq3_rec2
num_bytes: 297600930
num_examples: 31
- name: loc1_script2_seq7_rec1
num_bytes: 403201260
num_examples: 42
- name: loc2_script2_seq4_rec1
num_bytes: 153600480
num_examples: 16
- name: loc1_script2_seq8_rec1
num_bytes: 144000450
num_examples: 15
- name: loc2_script2_seq1_rec2
num_bytes: 201600630
num_examples: 21
- name: loc5_script5_seq1_rec1
num_bytes: 192000600
num_examples: 20
- name: loc2_script5_seq4_rec1
num_bytes: 38400120
num_examples: 4
- name: loc2_script2_seq2_rec1
num_bytes: 163200510
num_examples: 17
- name: loc4_script5_seq3_rec1
num_bytes: 240000750
num_examples: 25
- name: loc2_script5_seq3_rec1
num_bytes: 96000300
num_examples: 10
- name: loc1_script2_seq4_rec2
num_bytes: 316800990
num_examples: 33
- name: loc2_script2_seq2_rec2
num_bytes: 153600480
num_examples: 16
- name: loc1_script4_seq4_rec1
num_bytes: 710402220
num_examples: 74
- name: loc2_script1_seq2_rec1
num_bytes: 240000750
num_examples: 25
- name: loc3_script2_seq3_rec1
num_bytes: 288000900
num_examples: 30
- name: loc2_script3_seq4_rec2
num_bytes: 384001200
num_examples: 40
- name: loc3_script2_seq1_rec2
num_bytes: 172800540
num_examples: 18
- name: loc2_script2_seq5_rec1
num_bytes: 384001200
num_examples: 40
- name: loc1_script3_seq5_rec1
num_bytes: 547201710
num_examples: 57
- name: loc3_script5_seq5_rec1
num_bytes: 211200660
num_examples: 22
- name: loc2_script5_seq7_rec1
num_bytes: 211200660
num_examples: 22
- name: loc2_script5_seq5_rec1
num_bytes: 259200810
num_examples: 27
- name: loc4_script3_seq4_rec1
num_bytes: 105600330
num_examples: 11
- name: loc5_script4_seq4_rec1
num_bytes: 470401470
num_examples: 49
- name: loc4_script2_seq1_rec2
num_bytes: 134400420
num_examples: 14
- name: loc3_script4_seq5_rec1
num_bytes: 211200660
num_examples: 22
- name: loc4_script2_seq3_rec2
num_bytes: 278400870
num_examples: 29
- name: loc2_script2_seq3_rec2
num_bytes: 115200360
num_examples: 12
- name: loc1_script2_seq6_rec2
num_bytes: 211200660
num_examples: 22
- name: loc1_script5_seq3_rec1
num_bytes: 192000600
num_examples: 20
- name: loc3_script2_seq7_rec2
num_bytes: 144000450
num_examples: 15
- name: loc2_script1_seq5_rec1
num_bytes: 240000750
num_examples: 25
- name: loc1_script5_seq6_rec1
num_bytes: 288000900
num_examples: 30
- name: loc3_script2_seq5_rec2
num_bytes: 201600630
num_examples: 21
- name: loc3_script1_seq7_rec1
num_bytes: 163200510
num_examples: 17
- name: loc1_script2_seq6_rec1
num_bytes: 192000600
num_examples: 20
- name: loc1_script5_seq2_rec1
num_bytes: 105600330
num_examples: 11
- name: loc5_script4_seq6_rec1
num_bytes: 326401020
num_examples: 34
- name: loc2_script5_seq2_rec1
num_bytes: 105600330
num_examples: 11
- name: loc4_script2_seq2_rec1
num_bytes: 134400420
num_examples: 14
- name: loc2_script4_seq7_rec1
num_bytes: 345601080
num_examples: 36
- name: loc4_script1_seq6_rec1
num_bytes: 182400570
num_examples: 19
- name: loc2_script5_seq6_rec1
num_bytes: 172800540
num_examples: 18
- name: loc3_script4_seq7_rec1
num_bytes: 259200810
num_examples: 27
- name: loc4_script2_seq4_rec1
num_bytes: 192000600
num_examples: 20
- name: loc5_script4_seq5_rec1
num_bytes: 787202460
num_examples: 82
- name: loc3_script2_seq4_rec1
num_bytes: 172800540
num_examples: 18
- name: loc1_script1_seq6_rec1
num_bytes: 240000750
num_examples: 25
- name: loc2_script1_seq4_rec1
num_bytes: 451201410
num_examples: 47
- name: loc1_script2_seq1_rec1
num_bytes: 182400570
num_examples: 19
- name: loc1_script2_seq4_rec1
num_bytes: 326401020
num_examples: 34
- name: loc4_script1_seq5_rec1
num_bytes: 230400720
num_examples: 24
- name: loc3_script4_seq2_rec1
num_bytes: 470401470
num_examples: 49
- name: loc3_script3_seq1_rec2
num_bytes: 96000300
num_examples: 10
- name: loc5_script4_seq1_rec1
num_bytes: 259200810
num_examples: 27
- name: loc4_script4_seq2_rec1
num_bytes: 144000450
num_examples: 15
- name: loc1_script4_seq3_rec1
num_bytes: 374401170
num_examples: 39
- name: loc4_script5_seq1_rec1
num_bytes: 76800240
num_examples: 8
- name: loc2_script1_seq3_rec1
num_bytes: 326401020
num_examples: 34
- name: loc2_script3_seq4_rec1
num_bytes: 384001200
num_examples: 40
- name: loc2_script2_seq4_rec2
num_bytes: 144000450
num_examples: 15
- name: loc5_script5_seq4_rec1
num_bytes: 288000900
num_examples: 30
- name: loc2_script4_seq5_rec1
num_bytes: 374401170
num_examples: 39
- name: loc2_script4_seq4_rec1
num_bytes: 1536004800
num_examples: 160
- name: loc3_script1_seq1_rec1
num_bytes: 230400720
num_examples: 24
- name: loc2_script3_seq2_rec1
num_bytes: 124800390
num_examples: 13
- name: loc2_script1_seq6_rec1
num_bytes: 297600930
num_examples: 31
- name: loc5_script4_seq3_rec1
num_bytes: 576001800
num_examples: 60
- name: loc3_script1_seq2_rec1
num_bytes: 345601080
num_examples: 36
- name: loc3_script2_seq1_rec1
num_bytes: 172800540
num_examples: 18
- name: loc2_script3_seq5_rec2
num_bytes: 345601080
num_examples: 36
- name: loc3_script1_seq5_rec1
num_bytes: 268800840
num_examples: 28
- name: loc1_script2_seq3_rec1
num_bytes: 297600930
num_examples: 31
- name: loc3_script3_seq4_rec2
num_bytes: 422401320
num_examples: 44
- name: loc2_script3_seq3_rec2
num_bytes: 1017603180
num_examples: 106
- name: loc2_script1_seq7_rec1
num_bytes: 163200510
num_examples: 17
- name: loc4_script5_seq7_rec1
num_bytes: 172800540
num_examples: 18
- name: loc2_script2_seq6_rec2
num_bytes: 96000300
num_examples: 10
- name: loc3_script2_seq2_rec1
num_bytes: 211200660
num_examples: 22
- name: loc1_script3_seq2_rec1
num_bytes: 124800390
num_examples: 13
- name: loc5_script4_seq2_rec1
num_bytes: 412801290
num_examples: 43
- name: loc4_script3_seq3_rec1
num_bytes: 700802190
num_examples: 73
- name: loc2_script2_seq8_rec1
num_bytes: 144000450
num_examples: 15
- name: loc1_script4_seq5_rec1
num_bytes: 384001200
num_examples: 40
- name: loc3_script3_seq2_rec1
num_bytes: 259200810
num_examples: 27
- name: loc2_script2_seq6_rec1
num_bytes: 96000300
num_examples: 10
- name: loc2_script3_seq2_rec2
num_bytes: 124800390
num_examples: 13
- name: loc2_script3_seq5_rec1
num_bytes: 364801140
num_examples: 38
- name: loc2_script3_seq1_rec1
num_bytes: 48000150
num_examples: 5
- name: loc3_script5_seq6_rec1
num_bytes: 240000750
num_examples: 25
- name: loc3_script3_seq5_rec1
num_bytes: 288000900
num_examples: 30
- name: loc1_script5_seq5_rec1
num_bytes: 451201410
num_examples: 47
- name: loc5_script5_seq3_rec1
num_bytes: 316800990
num_examples: 33
- name: loc3_script5_seq4_rec1
num_bytes: 105600330
num_examples: 11
- name: loc2_script3_seq3_rec1
num_bytes: 1017603180
num_examples: 106
- name: loc3_script5_seq3_rec1
num_bytes: 172800540
num_examples: 18
- name: loc3_script2_seq7_rec1
num_bytes: 153600480
num_examples: 16
- name: loc1_script1_seq1_rec1
num_bytes: 480001500
num_examples: 50
- name: loc3_script5_seq2_rec1
num_bytes: 153600480
num_examples: 16
- name: loc3_script1_seq4_rec1
num_bytes: 652802040
num_examples: 68
- name: loc4_script3_seq2_rec2
num_bytes: 86400270
num_examples: 9
- name: loc2_script3_seq1_rec2
num_bytes: 48000150
num_examples: 5
- name: loc4_script1_seq1_rec1
num_bytes: 249600780
num_examples: 26
- name: loc5_script5_seq6_rec1
num_bytes: 662402070
num_examples: 69
- name: loc3_script5_seq1_rec1
num_bytes: 124800390
num_examples: 13
- name: loc1_script3_seq1_rec1
num_bytes: 96000300
num_examples: 10
- name: loc2_script2_seq3_rec1
num_bytes: 115200360
num_examples: 12
- name: loc5_script5_seq5_rec1
num_bytes: 672002100
num_examples: 70
- name: loc3_script1_seq3_rec1
num_bytes: 393601230
num_examples: 41
- name: loc1_script5_seq1_rec1
num_bytes: 67200210
num_examples: 7
- name: loc3_script4_seq3_rec1
num_bytes: 307200960
num_examples: 32
- name: loc2_script2_seq1_rec1
num_bytes: 201600630
num_examples: 21
- name: loc2_script5_seq1_rec1
num_bytes: 172800540
num_examples: 18
- name: loc1_script2_seq1_rec2
num_bytes: 182400570
num_examples: 19
- name: loc5_script5_seq2_rec1
num_bytes: 249600780
num_examples: 26
- name: loc4_script2_seq8_rec2
num_bytes: 86400270
num_examples: 9
download_size: 29983014860
dataset_size: 39312122850
configs:
- config_name: default
data_files:
- split: loc3_script4_seq4_rec1
path: data/loc3_script4_seq4_rec1-*
- split: loc4_script1_seq3_rec1
path: data/loc4_script1_seq3_rec1-*
- split: loc4_script2_seq7_rec1
path: data/loc4_script2_seq7_rec1-*
- split: loc5_script5_seq7_rec1
path: data/loc5_script5_seq7_rec1-*
- split: loc3_script2_seq5_rec1
path: data/loc3_script2_seq5_rec1-*
- split: loc3_script5_seq7_rec1
path: data/loc3_script5_seq7_rec1-*
- split: loc2_script2_seq5_rec2
path: data/loc2_script2_seq5_rec2-*
- split: loc1_script1_seq5_rec1
path: data/loc1_script1_seq5_rec1-*
- split: loc3_script2_seq4_rec2
path: data/loc3_script2_seq4_rec2-*
- split: loc3_script1_seq6_rec1
path: data/loc3_script1_seq6_rec1-*
- split: loc4_script3_seq1_rec2
path: data/loc4_script3_seq1_rec2-*
- split: loc2_script2_seq8_rec2
path: data/loc2_script2_seq8_rec2-*
- split: loc2_script1_seq1_rec1
path: data/loc2_script1_seq1_rec1-*
- split: loc4_script2_seq6_rec1
path: data/loc4_script2_seq6_rec1-*
- split: loc1_script1_seq7_rec1
path: data/loc1_script1_seq7_rec1-*
- split: loc2_script4_seq3_rec1
path: data/loc2_script4_seq3_rec1-*
- split: loc3_script2_seq3_rec2
path: data/loc3_script2_seq3_rec2-*
- split: loc3_script3_seq5_rec2
path: data/loc3_script3_seq5_rec2-*
- split: loc1_script2_seq8_rec2
path: data/loc1_script2_seq8_rec2-*
- split: loc1_script4_seq2_rec1
path: data/loc1_script4_seq2_rec1-*
- split: loc3_script3_seq4_rec1
path: data/loc3_script3_seq4_rec1-*
- split: loc3_script3_seq2_rec2
path: data/loc3_script3_seq2_rec2-*
- split: loc1_script1_seq3_rec1
path: data/loc1_script1_seq3_rec1-*
- split: loc3_script3_seq1_rec1
path: data/loc3_script3_seq1_rec1-*
- split: loc1_script2_seq3_rec2
path: data/loc1_script2_seq3_rec2-*
- split: loc1_script2_seq7_rec1
path: data/loc1_script2_seq7_rec1-*
- split: loc2_script2_seq4_rec1
path: data/loc2_script2_seq4_rec1-*
- split: loc1_script2_seq8_rec1
path: data/loc1_script2_seq8_rec1-*
- split: loc2_script2_seq1_rec2
path: data/loc2_script2_seq1_rec2-*
- split: loc5_script5_seq1_rec1
path: data/loc5_script5_seq1_rec1-*
- split: loc2_script5_seq4_rec1
path: data/loc2_script5_seq4_rec1-*
- split: loc2_script2_seq2_rec1
path: data/loc2_script2_seq2_rec1-*
- split: loc4_script5_seq3_rec1
path: data/loc4_script5_seq3_rec1-*
- split: loc2_script5_seq3_rec1
path: data/loc2_script5_seq3_rec1-*
- split: loc1_script2_seq4_rec2
path: data/loc1_script2_seq4_rec2-*
- split: loc2_script2_seq2_rec2
path: data/loc2_script2_seq2_rec2-*
- split: loc1_script4_seq4_rec1
path: data/loc1_script4_seq4_rec1-*
- split: loc2_script1_seq2_rec1
path: data/loc2_script1_seq2_rec1-*
- split: loc3_script2_seq3_rec1
path: data/loc3_script2_seq3_rec1-*
- split: loc2_script3_seq4_rec2
path: data/loc2_script3_seq4_rec2-*
- split: loc3_script2_seq1_rec2
path: data/loc3_script2_seq1_rec2-*
- split: loc2_script2_seq5_rec1
path: data/loc2_script2_seq5_rec1-*
- split: loc1_script3_seq5_rec1
path: data/loc1_script3_seq5_rec1-*
- split: loc3_script5_seq5_rec1
path: data/loc3_script5_seq5_rec1-*
- split: loc2_script5_seq7_rec1
path: data/loc2_script5_seq7_rec1-*
- split: loc2_script5_seq5_rec1
path: data/loc2_script5_seq5_rec1-*
- split: loc4_script3_seq4_rec1
path: data/loc4_script3_seq4_rec1-*
- split: loc5_script4_seq4_rec1
path: data/loc5_script4_seq4_rec1-*
- split: loc4_script2_seq1_rec2
path: data/loc4_script2_seq1_rec2-*
- split: loc3_script4_seq5_rec1
path: data/loc3_script4_seq5_rec1-*
- split: loc4_script2_seq3_rec2
path: data/loc4_script2_seq3_rec2-*
- split: loc2_script2_seq3_rec2
path: data/loc2_script2_seq3_rec2-*
- split: loc1_script2_seq6_rec2
path: data/loc1_script2_seq6_rec2-*
- split: loc1_script5_seq3_rec1
path: data/loc1_script5_seq3_rec1-*
- split: loc3_script2_seq7_rec2
path: data/loc3_script2_seq7_rec2-*
- split: loc2_script1_seq5_rec1
path: data/loc2_script1_seq5_rec1-*
- split: loc1_script5_seq6_rec1
path: data/loc1_script5_seq6_rec1-*
- split: loc3_script2_seq5_rec2
path: data/loc3_script2_seq5_rec2-*
- split: loc3_script1_seq7_rec1
path: data/loc3_script1_seq7_rec1-*
- split: loc1_script2_seq6_rec1
path: data/loc1_script2_seq6_rec1-*
- split: loc1_script5_seq2_rec1
path: data/loc1_script5_seq2_rec1-*
- split: loc5_script4_seq6_rec1
path: data/loc5_script4_seq6_rec1-*
- split: loc2_script5_seq2_rec1
path: data/loc2_script5_seq2_rec1-*
- split: loc4_script2_seq2_rec1
path: data/loc4_script2_seq2_rec1-*
- split: loc2_script4_seq7_rec1
path: data/loc2_script4_seq7_rec1-*
- split: loc4_script1_seq6_rec1
path: data/loc4_script1_seq6_rec1-*
- split: loc2_script5_seq6_rec1
path: data/loc2_script5_seq6_rec1-*
- split: loc3_script4_seq7_rec1
path: data/loc3_script4_seq7_rec1-*
- split: loc4_script2_seq4_rec1
path: data/loc4_script2_seq4_rec1-*
- split: loc5_script4_seq5_rec1
path: data/loc5_script4_seq5_rec1-*
- split: loc3_script2_seq4_rec1
path: data/loc3_script2_seq4_rec1-*
- split: loc1_script1_seq6_rec1
path: data/loc1_script1_seq6_rec1-*
- split: loc2_script1_seq4_rec1
path: data/loc2_script1_seq4_rec1-*
- split: loc1_script2_seq1_rec1
path: data/loc1_script2_seq1_rec1-*
- split: loc1_script2_seq4_rec1
path: data/loc1_script2_seq4_rec1-*
- split: loc4_script1_seq5_rec1
path: data/loc4_script1_seq5_rec1-*
- split: loc3_script4_seq2_rec1
path: data/loc3_script4_seq2_rec1-*
- split: loc3_script3_seq1_rec2
path: data/loc3_script3_seq1_rec2-*
- split: loc5_script4_seq1_rec1
path: data/loc5_script4_seq1_rec1-*
- split: loc4_script4_seq2_rec1
path: data/loc4_script4_seq2_rec1-*
- split: loc1_script4_seq3_rec1
path: data/loc1_script4_seq3_rec1-*
- split: loc4_script5_seq1_rec1
path: data/loc4_script5_seq1_rec1-*
- split: loc2_script1_seq3_rec1
path: data/loc2_script1_seq3_rec1-*
- split: loc2_script3_seq4_rec1
path: data/loc2_script3_seq4_rec1-*
- split: loc2_script2_seq4_rec2
path: data/loc2_script2_seq4_rec2-*
- split: loc5_script5_seq4_rec1
path: data/loc5_script5_seq4_rec1-*
- split: loc2_script4_seq5_rec1
path: data/loc2_script4_seq5_rec1-*
- split: loc2_script4_seq4_rec1
path: data/loc2_script4_seq4_rec1-*
- split: loc3_script1_seq1_rec1
path: data/loc3_script1_seq1_rec1-*
- split: loc2_script3_seq2_rec1
path: data/loc2_script3_seq2_rec1-*
- split: loc2_script1_seq6_rec1
path: data/loc2_script1_seq6_rec1-*
- split: loc5_script4_seq3_rec1
path: data/loc5_script4_seq3_rec1-*
- split: loc3_script1_seq2_rec1
path: data/loc3_script1_seq2_rec1-*
- split: loc3_script2_seq1_rec1
path: data/loc3_script2_seq1_rec1-*
- split: loc2_script3_seq5_rec2
path: data/loc2_script3_seq5_rec2-*
- split: loc3_script1_seq5_rec1
path: data/loc3_script1_seq5_rec1-*
- split: loc1_script2_seq3_rec1
path: data/loc1_script2_seq3_rec1-*
- split: loc3_script3_seq4_rec2
path: data/loc3_script3_seq4_rec2-*
- split: loc2_script3_seq3_rec2
path: data/loc2_script3_seq3_rec2-*
- split: loc2_script1_seq7_rec1
path: data/loc2_script1_seq7_rec1-*
- split: loc4_script5_seq7_rec1
path: data/loc4_script5_seq7_rec1-*
- split: loc2_script2_seq6_rec2
path: data/loc2_script2_seq6_rec2-*
- split: loc3_script2_seq2_rec1
path: data/loc3_script2_seq2_rec1-*
- split: loc1_script3_seq2_rec1
path: data/loc1_script3_seq2_rec1-*
- split: loc5_script4_seq2_rec1
path: data/loc5_script4_seq2_rec1-*
- split: loc4_script3_seq3_rec1
path: data/loc4_script3_seq3_rec1-*
- split: loc2_script2_seq8_rec1
path: data/loc2_script2_seq8_rec1-*
- split: loc1_script4_seq5_rec1
path: data/loc1_script4_seq5_rec1-*
- split: loc3_script3_seq2_rec1
path: data/loc3_script3_seq2_rec1-*
- split: loc2_script2_seq6_rec1
path: data/loc2_script2_seq6_rec1-*
- split: loc2_script3_seq2_rec2
path: data/loc2_script3_seq2_rec2-*
- split: loc2_script3_seq5_rec1
path: data/loc2_script3_seq5_rec1-*
- split: loc2_script3_seq1_rec1
path: data/loc2_script3_seq1_rec1-*
- split: loc3_script5_seq6_rec1
path: data/loc3_script5_seq6_rec1-*
- split: loc3_script3_seq5_rec1
path: data/loc3_script3_seq5_rec1-*
- split: loc1_script5_seq5_rec1
path: data/loc1_script5_seq5_rec1-*
- split: loc5_script5_seq3_rec1
path: data/loc5_script5_seq3_rec1-*
- split: loc3_script5_seq4_rec1
path: data/loc3_script5_seq4_rec1-*
- split: loc2_script3_seq3_rec1
path: data/loc2_script3_seq3_rec1-*
- split: loc3_script5_seq3_rec1
path: data/loc3_script5_seq3_rec1-*
- split: loc3_script2_seq7_rec1
path: data/loc3_script2_seq7_rec1-*
- split: loc1_script1_seq1_rec1
path: data/loc1_script1_seq1_rec1-*
- split: loc3_script5_seq2_rec1
path: data/loc3_script5_seq2_rec1-*
- split: loc3_script1_seq4_rec1
path: data/loc3_script1_seq4_rec1-*
- split: loc4_script3_seq2_rec2
path: data/loc4_script3_seq2_rec2-*
- split: loc2_script3_seq1_rec2
path: data/loc2_script3_seq1_rec2-*
- split: loc4_script1_seq1_rec1
path: data/loc4_script1_seq1_rec1-*
- split: loc5_script5_seq6_rec1
path: data/loc5_script5_seq6_rec1-*
- split: loc3_script5_seq1_rec1
path: data/loc3_script5_seq1_rec1-*
- split: loc1_script3_seq1_rec1
path: data/loc1_script3_seq1_rec1-*
- split: loc2_script2_seq3_rec1
path: data/loc2_script2_seq3_rec1-*
- split: loc5_script5_seq5_rec1
path: data/loc5_script5_seq5_rec1-*
- split: loc3_script1_seq3_rec1
path: data/loc3_script1_seq3_rec1-*
- split: loc1_script5_seq1_rec1
path: data/loc1_script5_seq1_rec1-*
- split: loc3_script4_seq3_rec1
path: data/loc3_script4_seq3_rec1-*
- split: loc2_script2_seq1_rec1
path: data/loc2_script2_seq1_rec1-*
- split: loc2_script5_seq1_rec1
path: data/loc2_script5_seq1_rec1-*
- split: loc1_script2_seq1_rec2
path: data/loc1_script2_seq1_rec2-*
- split: loc5_script5_seq2_rec1
path: data/loc5_script5_seq2_rec1-*
- split: loc4_script2_seq8_rec2
path: data/loc4_script2_seq8_rec2-*
---
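The split names declared above follow a regular `loc<L>_script<S>_seq<Q>_rec<R>` pattern. A minimal helper for parsing them into their components (this naming convention is inferred from the split list above, not documented by the dataset, so treat it as an assumption):

```python
import re

def parse_split_name(name: str) -> dict:
    """Parse a split name like 'loc3_script4_seq4_rec1' into its parts.

    The location/script/sequence/recording interpretation is an assumption
    inferred from the split names in this card's metadata.
    """
    m = re.fullmatch(r"loc(\d+)_script(\d+)_seq(\d+)_rec(\d+)", name)
    if m is None:
        raise ValueError(f"unrecognized split name: {name}")
    loc, script, seq, rec = map(int, m.groups())
    return {"location": loc, "script": script, "sequence": seq, "recording": rec}

# To load one split (names as declared in the YAML above), one would use e.g.:
#   from datasets import load_dataset
#   ds = load_dataset("danjacobellis/aria_ea_audio", split="loc3_script4_seq4_rec1")
print(parse_split_name("loc3_script4_seq4_rec1"))
```

This makes it easy to group splits by location or script when iterating over the full split list.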
|
open-llm-leaderboard/details_Weyaxi__Samantha-Nebula-7B | ---
pretty_name: Evaluation run of Weyaxi/Samantha-Nebula-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/Samantha-Nebula-7B](https://huggingface.co/Weyaxi/Samantha-Nebula-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Samantha-Nebula-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T22:52:33.668661](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Samantha-Nebula-7B/blob/main/results_2023-10-24T22-52-33.668661.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3792994966442953,\n\
\ \"em_stderr\": 0.004969032454438954,\n \"f1\": 0.4256501677852355,\n\
\ \"f1_stderr\": 0.0048455756354128885,\n \"acc\": 0.42229140848972546,\n\
\ \"acc_stderr\": 0.010604861041151385\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.3792994966442953,\n \"em_stderr\": 0.004969032454438954,\n\
\ \"f1\": 0.4256501677852355,\n \"f1_stderr\": 0.0048455756354128885\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11372251705837756,\n \
\ \"acc_stderr\": 0.008744810131034036\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7308602999210734,\n \"acc_stderr\": 0.012464911951268734\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/Samantha-Nebula-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T22_52_33.668661
path:
- '**/details_harness|drop|3_2023-10-24T22-52-33.668661.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T22-52-33.668661.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T22_52_33.668661
path:
- '**/details_harness|gsm8k|5_2023-10-24T22-52-33.668661.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T22-52-33.668661.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-36-46.129297.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-36-46.129297.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-09T12-36-46.129297.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T22_52_33.668661
path:
- '**/details_harness|winogrande|5_2023-10-24T22-52-33.668661.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T22-52-33.668661.parquet'
- config_name: results
data_files:
- split: 2023_10_09T12_36_46.129297
path:
- results_2023-10-09T12-36-46.129297.parquet
- split: 2023_10_24T22_52_33.668661
path:
- results_2023-10-24T22-52-33.668661.parquet
- split: latest
path:
- results_2023-10-24T22-52-33.668661.parquet
---
# Dataset Card for Evaluation run of Weyaxi/Samantha-Nebula-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Weyaxi/Samantha-Nebula-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Weyaxi/Samantha-Nebula-7B](https://huggingface.co/Weyaxi/Samantha-Nebula-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Samantha-Nebula-7B",
"harness_winogrande_5",
split="latest")
```
## Latest results
These are the [latest results from run 2023-10-24T22:52:33.668661](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Samantha-Nebula-7B/blob/main/results_2023-10-24T22-52-33.668661.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find them in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.3792994966442953,
"em_stderr": 0.004969032454438954,
"f1": 0.4256501677852355,
"f1_stderr": 0.0048455756354128885,
"acc": 0.42229140848972546,
"acc_stderr": 0.010604861041151385
},
"harness|drop|3": {
"em": 0.3792994966442953,
"em_stderr": 0.004969032454438954,
"f1": 0.4256501677852355,
"f1_stderr": 0.0048455756354128885
},
"harness|gsm8k|5": {
"acc": 0.11372251705837756,
"acc_stderr": 0.008744810131034036
},
"harness|winogrande|5": {
"acc": 0.7308602999210734,
"acc_stderr": 0.012464911951268734
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
kpriyanshu256/MultiTabQA-multitable_pretraining-train-v2-47000 | ---
dataset_info:
features:
- name: tables
sequence: string
- name: table_names
sequence: string
- name: query
dtype: string
- name: answer
dtype: string
- name: source
dtype: string
- name: target
dtype: string
- name: source_latex
dtype: string
- name: target_latex
dtype: string
- name: source_html
dtype: string
- name: target_html
dtype: string
- name: source_markdown
dtype: string
- name: target_markdown
dtype: string
splits:
- name: train
num_bytes: 3194995307
num_examples: 500
download_size: 662432549
dataset_size: 3194995307
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
globalids/Pharma | ---
license: unknown
---
|
Shivam22182/model | ---
license: unknown
task_categories:
- question-answering
tags:
- code
size_categories:
- 10K<n<100K
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
PlinStudios/plynkz | ---
license: cc
---
|
chrisgg1/keywords_verbinden6 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: label
dtype:
class_label:
names:
'0': eins
'1': ja
'2': nein
'3': verbinden
splits:
- name: train
num_bytes: 1036620540.45
num_examples: 7981
download_size: 592054581
dataset_size: 1036620540.45
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
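The `class_label` block above maps integer labels to four German keywords. A minimal plain-Python sketch of that decoding (the helper name `decode` is invented for illustration):

```python
# The ClassLabel names declared in the metadata above, in index order.
names = ["eins", "ja", "nein", "verbinden"]

def decode(label: int) -> str:
    """Map an integer label from the dataset back to its keyword string."""
    return names[label]

print(decode(3))  # verbinden
```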
|
Falah/desert_arabic_fashion_SDXL_refiner_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 1940664703
num_examples: 2000000
download_size: 192665123
dataset_size: 1940664703
---
# Dataset Card for "desert_arabic_fashion_SDXL_refiner_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Atipico1/trivia-top5 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: id
dtype: string
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 36187270.292568386
num_examples: 10000
- name: test
num_bytes: 41019784
num_examples: 11313
- name: validation
num_bytes: 32039567
num_examples: 8837
download_size: 66258055
dataset_size: 109246621.29256839
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
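The `ctxs` field declared above pairs each question with retrieved passages carrying a per-passage `hasanswer` flag, which is enough to compute retrieval recall@k. A minimal sketch on toy data (the two examples and the `recall_at_k` helper below are invented for illustration, not part of the dataset):

```python
# Toy stand-ins matching the schema above: question, answers, and a
# list of retrieved contexts with hasanswer/id/score/text/title fields.
examples = [
    {"question": "Who wrote Hamlet?",
     "answers": ["William Shakespeare"],
     "ctxs": [{"hasanswer": True, "id": "1", "score": 81.2,
               "text": "Hamlet was written by William Shakespeare.",
               "title": "Hamlet"}]},
    {"question": "Capital of Atlantis?",
     "answers": ["unknown"],
     "ctxs": [{"hasanswer": False, "id": "2", "score": 12.3,
               "text": "Atlantis is a legendary island.",
               "title": "Atlantis"}]},
]

def recall_at_k(examples, k=5):
    """Fraction of questions whose top-k contexts contain a gold answer."""
    hits = sum(any(c["hasanswer"] for c in ex["ctxs"][:k]) for ex in examples)
    return hits / len(examples)

print(recall_at_k(examples))  # 0.5
```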
|
open-llm-leaderboard/details_MisterRid__wendigo-14b-alpha2 | ---
pretty_name: Evaluation run of MisterRid/wendigo-14b-alpha2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MisterRid/wendigo-14b-alpha2](https://huggingface.co/MisterRid/wendigo-14b-alpha2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MisterRid__wendigo-14b-alpha2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-18T06:03:21.055340](https://huggingface.co/datasets/open-llm-leaderboard/details_MisterRid__wendigo-14b-alpha2/blob/main/results_2023-12-18T06-03-21.055340.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5760376255323894,\n\
\ \"acc_stderr\": 0.03389255049926726,\n \"acc_norm\": 0.5830693356885244,\n\
\ \"acc_norm_stderr\": 0.03462115663481434,\n \"mc1\": 0.3929008567931457,\n\
\ \"mc1_stderr\": 0.017097248285233065,\n \"mc2\": 0.5371025434721111,\n\
\ \"mc2_stderr\": 0.015786315933755037\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5290102389078498,\n \"acc_stderr\": 0.014586776355294321,\n\
\ \"acc_norm\": 0.5665529010238908,\n \"acc_norm_stderr\": 0.014481376224558902\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5812587134037045,\n\
\ \"acc_stderr\": 0.004923445627861517,\n \"acc_norm\": 0.77185819557857,\n\
\ \"acc_norm_stderr\": 0.004187768949417078\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411022,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411022\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.373015873015873,\n \"acc_stderr\": 0.02490699045899257,\n \"acc_norm\"\
: 0.373015873015873,\n \"acc_norm_stderr\": 0.02490699045899257\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6870967741935484,\n\
\ \"acc_stderr\": 0.02637756702864586,\n \"acc_norm\": 0.6870967741935484,\n\
\ \"acc_norm_stderr\": 0.02637756702864586\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3842364532019704,\n \"acc_stderr\": 0.0342239856565755,\n\
\ \"acc_norm\": 0.3842364532019704,\n \"acc_norm_stderr\": 0.0342239856565755\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7823834196891192,\n \"acc_stderr\": 0.02977866303775295,\n\
\ \"acc_norm\": 0.7823834196891192,\n \"acc_norm_stderr\": 0.02977866303775295\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6025641025641025,\n \"acc_stderr\": 0.024811920017903836,\n\
\ \"acc_norm\": 0.6025641025641025,\n \"acc_norm_stderr\": 0.024811920017903836\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.03175367846096626,\n \
\ \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.03175367846096626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8128440366972477,\n \"acc_stderr\": 0.01672268452620014,\n \"\
acc_norm\": 0.8128440366972477,\n \"acc_norm_stderr\": 0.01672268452620014\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7352941176470589,\n \"acc_stderr\": 0.0309645179269234,\n \"acc_norm\"\
: 0.7352941176470589,\n \"acc_norm_stderr\": 0.0309645179269234\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.6624472573839663,\n \"acc_stderr\": 0.03078154910202622,\n \"\
acc_norm\": 0.6624472573839663,\n \"acc_norm_stderr\": 0.03078154910202622\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709698,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709698\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046735,\n\
\ \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046735\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.04742762361243011,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.04742762361243011\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.025140935950335445,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.025140935950335445\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7726692209450831,\n\
\ \"acc_stderr\": 0.014987270640946012,\n \"acc_norm\": 0.7726692209450831,\n\
\ \"acc_norm_stderr\": 0.014987270640946012\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5838150289017341,\n \"acc_stderr\": 0.02653818910470547,\n\
\ \"acc_norm\": 0.5838150289017341,\n \"acc_norm_stderr\": 0.02653818910470547\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.32849162011173183,\n\
\ \"acc_stderr\": 0.01570793539849645,\n \"acc_norm\": 0.32849162011173183,\n\
\ \"acc_norm_stderr\": 0.01570793539849645\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.02763417668960266,\n\
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.02763417668960266\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n\
\ \"acc_stderr\": 0.026858825879488533,\n \"acc_norm\": 0.662379421221865,\n\
\ \"acc_norm_stderr\": 0.026858825879488533\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6697530864197531,\n \"acc_stderr\": 0.026168298456732846,\n\
\ \"acc_norm\": 0.6697530864197531,\n \"acc_norm_stderr\": 0.026168298456732846\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.029144544781596147,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.029144544781596147\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39048239895697523,\n\
\ \"acc_stderr\": 0.012460135913945077,\n \"acc_norm\": 0.39048239895697523,\n\
\ \"acc_norm_stderr\": 0.012460135913945077\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.029289413409403192,\n\
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.029289413409403192\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6078431372549019,\n \"acc_stderr\": 0.019751726508762637,\n \
\ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.019751726508762637\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670238,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670238\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5795918367346938,\n \"acc_stderr\": 0.03160106993449601,\n\
\ \"acc_norm\": 0.5795918367346938,\n \"acc_norm_stderr\": 0.03160106993449601\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\
\ \"acc_stderr\": 0.03187187537919795,\n \"acc_norm\": 0.7164179104477612,\n\
\ \"acc_norm_stderr\": 0.03187187537919795\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542126,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542126\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.0330140594698725,\n\
\ \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.0330140594698725\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3929008567931457,\n\
\ \"mc1_stderr\": 0.017097248285233065,\n \"mc2\": 0.5371025434721111,\n\
\ \"mc2_stderr\": 0.015786315933755037\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7363851617995264,\n \"acc_stderr\": 0.012382849299658466\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.22896133434420016,\n \
\ \"acc_stderr\": 0.011573412892418219\n }\n}\n```"
repo_url: https://huggingface.co/MisterRid/wendigo-14b-alpha2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|arc:challenge|25_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|gsm8k|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hellaswag|10_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T06-03-21.055340.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-18T06-03-21.055340.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- '**/details_harness|winogrande|5_2023-12-18T06-03-21.055340.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-18T06-03-21.055340.parquet'
- config_name: results
data_files:
- split: 2023_12_18T06_03_21.055340
path:
- results_2023-12-18T06-03-21.055340.parquet
- split: latest
path:
- results_2023-12-18T06-03-21.055340.parquet
---
# Dataset Card for Evaluation run of MisterRid/wendigo-14b-alpha2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MisterRid/wendigo-14b-alpha2](https://huggingface.co/MisterRid/wendigo-14b-alpha2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MisterRid__wendigo-14b-alpha2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-18T06:03:21.055340](https://huggingface.co/datasets/open-llm-leaderboard/details_MisterRid__wendigo-14b-alpha2/blob/main/results_2023-12-18T06-03-21.055340.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each result in the "results" and "latest" splits of each eval):
```python
{
"all": {
"acc": 0.5760376255323894,
"acc_stderr": 0.03389255049926726,
"acc_norm": 0.5830693356885244,
"acc_norm_stderr": 0.03462115663481434,
"mc1": 0.3929008567931457,
"mc1_stderr": 0.017097248285233065,
"mc2": 0.5371025434721111,
"mc2_stderr": 0.015786315933755037
},
"harness|arc:challenge|25": {
"acc": 0.5290102389078498,
"acc_stderr": 0.014586776355294321,
"acc_norm": 0.5665529010238908,
"acc_norm_stderr": 0.014481376224558902
},
"harness|hellaswag|10": {
"acc": 0.5812587134037045,
"acc_stderr": 0.004923445627861517,
"acc_norm": 0.77185819557857,
"acc_norm_stderr": 0.004187768949417078
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.02490699045899257,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.02490699045899257
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6870967741935484,
"acc_stderr": 0.02637756702864586,
"acc_norm": 0.6870967741935484,
"acc_norm_stderr": 0.02637756702864586
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3842364532019704,
"acc_stderr": 0.0342239856565755,
"acc_norm": 0.3842364532019704,
"acc_norm_stderr": 0.0342239856565755
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7823834196891192,
"acc_stderr": 0.02977866303775295,
"acc_norm": 0.7823834196891192,
"acc_norm_stderr": 0.02977866303775295
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6025641025641025,
"acc_stderr": 0.024811920017903836,
"acc_norm": 0.6025641025641025,
"acc_norm_stderr": 0.024811920017903836
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6050420168067226,
"acc_stderr": 0.03175367846096626,
"acc_norm": 0.6050420168067226,
"acc_norm_stderr": 0.03175367846096626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8128440366972477,
"acc_stderr": 0.01672268452620014,
"acc_norm": 0.8128440366972477,
"acc_norm_stderr": 0.01672268452620014
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.0309645179269234,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.0309645179269234
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6624472573839663,
"acc_stderr": 0.03078154910202622,
"acc_norm": 0.6624472573839663,
"acc_norm_stderr": 0.03078154910202622
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709698,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709698
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046735,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046735
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.04742762361243011,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.04742762361243011
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335445,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335445
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7726692209450831,
"acc_stderr": 0.014987270640946012,
"acc_norm": 0.7726692209450831,
"acc_norm_stderr": 0.014987270640946012
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.02653818910470547,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.02653818910470547
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.32849162011173183,
"acc_stderr": 0.01570793539849645,
"acc_norm": 0.32849162011173183,
"acc_norm_stderr": 0.01570793539849645
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.02763417668960266,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.02763417668960266
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.662379421221865,
"acc_stderr": 0.026858825879488533,
"acc_norm": 0.662379421221865,
"acc_norm_stderr": 0.026858825879488533
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6697530864197531,
"acc_stderr": 0.026168298456732846,
"acc_norm": 0.6697530864197531,
"acc_norm_stderr": 0.026168298456732846
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.029144544781596147,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.029144544781596147
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39048239895697523,
"acc_stderr": 0.012460135913945077,
"acc_norm": 0.39048239895697523,
"acc_norm_stderr": 0.012460135913945077
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.029289413409403192,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.029289413409403192
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.019751726508762637,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.019751726508762637
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670238,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670238
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5795918367346938,
"acc_stderr": 0.03160106993449601,
"acc_norm": 0.5795918367346938,
"acc_norm_stderr": 0.03160106993449601
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.03187187537919795,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.03187187537919795
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.0330140594698725,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.0330140594698725
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3929008567931457,
"mc1_stderr": 0.017097248285233065,
"mc2": 0.5371025434721111,
"mc2_stderr": 0.015786315933755037
},
"harness|winogrande|5": {
"acc": 0.7363851617995264,
"acc_stderr": 0.012382849299658466
},
"harness|gsm8k|5": {
"acc": 0.22896133434420016,
"acc_stderr": 0.011573412892418219
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
P1ayer-1/college-texts-annas-v1 | ---
dataset_info:
features:
- name: author
dtype: int64
- name: cover_url
dtype: string
- name: date_added
dtype: string
- name: date_modified
dtype: string
- name: description
dtype: float64
- name: edition
dtype: int64
- name: extension
dtype: string
- name: filesize
dtype: string
- name: filesize_reported
dtype: string
- name: in_libgen
dtype: string
- name: language
dtype: string
- name: md5
dtype: string
- name: md5_reported
dtype: string
- name: pages
dtype: string
- name: pilimi_torrent
dtype: string
- name: publisher
dtype: string
- name: series
dtype: string
- name: title
dtype: string
- name: unavailable
dtype: string
- name: volume
dtype: int64
- name: year
dtype: string
- name: zlibrary_id
dtype: int64
splits:
- name: train
num_bytes: 43134412
num_examples: 43206
download_size: 20108980
dataset_size: 43134412
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "college-texts-annas-v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AgoraX/AIEC-140K | ---
license: mit
task_categories:
- text-classification
- table-question-answering
- question-answering
- conversational
tags:
- code
size_categories:
- 100K<n<1M
---
# AgoraX/AIEC-140K Dataset
Excited to announce AgoraX/AIEC-140K, an all-new dataset of high-quality AI engineering code tokens totaling 140k samples!
## Introduction
The AgoraX/AIEC-140K dataset is a collection of AI engineering code tokens from top research labs such as OpenAI, Nvidia, Google, Lucidrains, and others. These tokens have been scraped from various repositories on GitHub, providing a valuable resource for researchers and developers in the field of Artificial Intelligence.
This README file serves as a guide to understand the dataset and effectively utilize its contents.
## Dataset Details
- Dataset Name: AgoraX/AIEC-140K
- Total Samples: 140,000
### Data Format
The dataset primarily consists of code tokens, which are the atomic units of code. Each code token is a single word or character representing a meaningful entity in AI engineering code. These tokens were collected from different repositories, ensuring a diverse collection of samples.
The data does not include complete code snippets or files but focuses on individual tokens to enable easy integration and usage in various downstream tasks.
### Data Sources
Code tokens in the AgoraX/AIEC-140K dataset are scraped from various repositories on GitHub. Prominent research labs including OpenAI, Nvidia, Google, Lucidrains, and others have contributed to this dataset.
Please note that the dataset does not provide details on the exact repositories or sources from which each token was scraped.
### Usage
The AgoraX/AIEC-140K dataset is a valuable resource for researchers, developers, and practitioners in the field of AI engineering. The dataset can be utilized for various purposes, including but not limited to:
- Training language models for code generation
- Pre-training and fine-tuning neural networks
- Code completion and suggestion systems
- Understanding and analyzing code patterns and trends in AI engineering
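As a toy illustration of the last use case, a frequency count over a handful of tokens can surface common patterns. The sample list below is invented for illustration and is not drawn from the actual dataset:

```python
from collections import Counter

# Hypothetical sample of code tokens, invented for illustration --
# not drawn from the actual dataset.
tokens = ["def", "forward", "(", "self", ",", "x", ")", ":", "return",
          "self", ".", "linear", "(", "x", ")"]

# Count how often each token appears; ties keep insertion order.
freq = Counter(tokens)
print(freq.most_common(3))  # → [('(', 2), ('self', 2), ('x', 2)]
```

The same counting approach scales directly to the full 140k-token corpus once loaded.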
## Citation
If you use the AgoraX/AIEC-140K dataset in your research work, please consider citing it using the following BibTeX:
```
@dataset{agorax/aiec140k,
author = {AgoraX Team},
title = {AgoraX/AIEC-140K Dataset},
year = {2022},
publisher = {Hugging Face},
url = {https://huggingface.co/datasets/agorax/aiec-140k}
}
```
## License
The AgoraX/AIEC-140K dataset is released under the [MIT License](https://opensource.org/licenses/MIT). Please refer to the LICENSE file in the dataset repository for more details.
## Contact
For any further inquiries or feedback regarding the dataset, please contact the AgoraX Team on Discord: https://discord.gg/t8SWA2CnVN
We appreciate your interest and hope that the AgoraX/AIEC-140K dataset proves to be a valuable asset in advancing AI engineering research and development. |
Nexdata/500_Hours_Brazilian_Portuguese_Spontaneous_Speech_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
Portuguese (Brazil) real-world casual conversation and monologue speech dataset. It covers self-media, conversation, livestreaming, and other generic domains, mirroring real-world interactions, and is transcribed with text content, speaker ID, gender, and other attributes. The data was collected from an extensive and geographically diverse pool of speakers, enhancing model performance on real, complex tasks, and has been quality-tested by various AI companies. We strictly adhere to data protection regulations and privacy standards, ensuring that user privacy and legal rights are maintained throughout data collection, storage, and usage; our datasets are fully GDPR, CCPA, and PIPL compliant.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1334?source=Huggingface
## Format
16 kHz, 16-bit, WAV, mono channel
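A minimal standard-library sketch of verifying that a clip matches the stated format. The in-memory synthetic tone below stands in for a real recording from the corpus:

```python
import io
import math
import struct
import wave

# Build a 0.1 s synthetic clip in the card's stated format:
# 16 kHz sample rate, 16-bit PCM, mono.
buf = io.BytesIO()
with wave.open(buf, "wb") as w:
    w.setnchannels(1)          # mono
    w.setsampwidth(2)          # 16 bit = 2 bytes per sample
    w.setframerate(16000)      # 16 kHz
    w.writeframes(b"".join(
        struct.pack("<h", int(3000 * math.sin(2 * math.pi * 440 * t / 16000)))
        for t in range(1600)))  # 1600 frames at 16 kHz = 0.1 s

# Any .wav file from the corpus can be checked the same way.
buf.seek(0)
with wave.open(buf, "rb") as r:
    assert (r.getnchannels(), r.getsampwidth(), r.getframerate()) == (1, 2, 16000)
    print(r.getnframes())  # → 1600
```

Replacing `buf` with a path to a downloaded `.wav` file applies the same check to actual corpus audio.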
## Content category
Including interview, self-media, variety show, etc.
## Recording environment
Low background noise
## Country
Brazil(BRA)
## Language(Region) Code
pt-BR
## Language
Portuguese
## Features of annotation
Transcription text, timestamp, speaker ID, gender, noise
## Accuracy
Word Accuracy Rate (WAR) 98%
# Licensing Information
Commercial License
|
open-llm-leaderboard/details_MaziyarPanahi__YamshadowInex12_Multi_verse_modelExperiment28 | ---
pretty_name: Evaluation run of MaziyarPanahi/YamshadowInex12_Multi_verse_modelExperiment28
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MaziyarPanahi/YamshadowInex12_Multi_verse_modelExperiment28](https://huggingface.co/MaziyarPanahi/YamshadowInex12_Multi_verse_modelExperiment28)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MaziyarPanahi__YamshadowInex12_Multi_verse_modelExperiment28\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T10:27:29.245074](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__YamshadowInex12_Multi_verse_modelExperiment28/blob/main/results_2024-04-09T10-27-29.245074.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6511927726361287,\n\
\ \"acc_stderr\": 0.03206620053671584,\n \"acc_norm\": 0.6502551914394495,\n\
\ \"acc_norm_stderr\": 0.03274217671140933,\n \"mc1\": 0.6352509179926561,\n\
\ \"mc1_stderr\": 0.01685096106172013,\n \"mc2\": 0.781362060720167,\n\
\ \"mc2_stderr\": 0.013641491113312233\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7150170648464164,\n \"acc_stderr\": 0.013191348179838793,\n\
\ \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.012955065963710696\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.716391157140012,\n\
\ \"acc_stderr\": 0.004498280244494491,\n \"acc_norm\": 0.8916550487950607,\n\
\ \"acc_norm_stderr\": 0.0031018035745563116\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268545,\n \"\
acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268545\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066485,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903343,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903343\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546836,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546836\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4301675977653631,\n\
\ \"acc_stderr\": 0.016558601636041035,\n \"acc_norm\": 0.4301675977653631,\n\
\ \"acc_norm_stderr\": 0.016558601636041035\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035454,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035454\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47327249022164275,\n\
\ \"acc_stderr\": 0.01275197796767601,\n \"acc_norm\": 0.47327249022164275,\n\
\ \"acc_norm_stderr\": 0.01275197796767601\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6352509179926561,\n\
\ \"mc1_stderr\": 0.01685096106172013,\n \"mc2\": 0.781362060720167,\n\
\ \"mc2_stderr\": 0.013641491113312233\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8531965272296764,\n \"acc_stderr\": 0.009946627440250677\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6959818043972706,\n \
\ \"acc_stderr\": 0.012670420440198669\n }\n}\n```"
repo_url: https://huggingface.co/MaziyarPanahi/YamshadowInex12_Multi_verse_modelExperiment28
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|arc:challenge|25_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|gsm8k|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hellaswag|10_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T10-27-29.245074.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T10-27-29.245074.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- '**/details_harness|winogrande|5_2024-04-09T10-27-29.245074.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T10-27-29.245074.parquet'
- config_name: results
data_files:
- split: 2024_04_09T10_27_29.245074
path:
- results_2024-04-09T10-27-29.245074.parquet
- split: latest
path:
- results_2024-04-09T10-27-29.245074.parquet
---
# Dataset Card for Evaluation run of MaziyarPanahi/YamshadowInex12_Multi_verse_modelExperiment28
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MaziyarPanahi/YamshadowInex12_Multi_verse_modelExperiment28](https://huggingface.co/MaziyarPanahi/YamshadowInex12_Multi_verse_modelExperiment28) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MaziyarPanahi__YamshadowInex12_Multi_verse_modelExperiment28",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-09T10:27:29.245074](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__YamshadowInex12_Multi_verse_modelExperiment28/blob/main/results_2024-04-09T10-27-29.245074.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6511927726361287,
"acc_stderr": 0.03206620053671584,
"acc_norm": 0.6502551914394495,
"acc_norm_stderr": 0.03274217671140933,
"mc1": 0.6352509179926561,
"mc1_stderr": 0.01685096106172013,
"mc2": 0.781362060720167,
"mc2_stderr": 0.013641491113312233
},
"harness|arc:challenge|25": {
"acc": 0.7150170648464164,
"acc_stderr": 0.013191348179838793,
"acc_norm": 0.7312286689419796,
"acc_norm_stderr": 0.012955065963710696
},
"harness|hellaswag|10": {
"acc": 0.716391157140012,
"acc_stderr": 0.004498280244494491,
"acc_norm": 0.8916550487950607,
"acc_norm_stderr": 0.0031018035745563116
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066485,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903343,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903343
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546836,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4301675977653631,
"acc_stderr": 0.016558601636041035,
"acc_norm": 0.4301675977653631,
"acc_norm_stderr": 0.016558601636041035
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035454,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47327249022164275,
"acc_stderr": 0.01275197796767601,
"acc_norm": 0.47327249022164275,
"acc_norm_stderr": 0.01275197796767601
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6352509179926561,
"mc1_stderr": 0.01685096106172013,
"mc2": 0.781362060720167,
"mc2_stderr": 0.013641491113312233
},
"harness|winogrande|5": {
"acc": 0.8531965272296764,
"acc_stderr": 0.009946627440250677
},
"harness|gsm8k|5": {
"acc": 0.6959818043972706,
"acc_stderr": 0.012670420440198669
}
}
```
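Once downloaded, the aggregated JSON above can be post-processed locally, for example to collect the per-task accuracies into a flat mapping. A minimal sketch, assuming the results have already been parsed into a dict (e.g. with `json.load`); the helper name `per_task_accuracy` is illustrative and not part of the evaluation harness:

```python
def per_task_accuracy(results: dict) -> dict:
    """Extract the plain accuracy ("acc") of every harness task,
    skipping the "all" aggregate and tasks that report other metrics."""
    return {
        task: metrics["acc"]
        for task, metrics in results.items()
        if task != "all" and "acc" in metrics
    }

# Tiny example mirroring the structure of the JSON above.
sample = {
    "all": {"acc": 0.65},
    "harness|winogrande|5": {"acc": 0.8531965272296764,
                             "acc_stderr": 0.009946627440250677},
    "harness|truthfulqa:mc|0": {"mc1": 0.6352509179926561,
                                "mc2": 0.781362060720167},
}
accs = per_task_accuracy(sample)
print(accs)  # only winogrande survives: truthfulqa reports mc1/mc2, not "acc"
```

Note that TruthfulQA entries expose `mc1`/`mc2` rather than `acc`, so they are filtered out by the `"acc" in metrics` guard rather than raising a `KeyError`.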
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
reubenhead/phoney-pii-en | ---
license: apache-2.0
---
|
jondurbin/airoboros-gpt4-1.3 | ---
license: cc-by-nc-4.0
---
## Overview
A continuation of [gpt4-1.2](https://huggingface.co/datasets/jondurbin/airoboros-gpt4-1.2), with:
* all coding instructions now have an equivalent "PLAINFORMAT" version
* several thousand new orca-style prompts, this time with reasoning first, then the response
* several examples of conversational/character interactions, with asterisked actions and quoted dialogue
_*Note: I did not filter by token length for this dataset; some examples are well over 2048 tokens, so use carefully.*_
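Given the warning above, a quick length pre-filter is often useful before training. A minimal sketch, assuming each record has `instruction` and `response` fields and using a rough whitespace token count as a proxy (a real tokenizer such as LLaMA's will count differently):

```python
def approx_token_count(text: str) -> int:
    # Rough proxy: whitespace-delimited tokens; real subword tokenizers differ.
    return len(text.split())

def filter_by_length(records, max_tokens=2048):
    """Keep only records whose combined instruction+response fits the budget."""
    kept = []
    for rec in records:
        total = approx_token_count(rec["instruction"]) + approx_token_count(rec["response"])
        if total <= max_tokens:
            kept.append(rec)
    return kept

# Hypothetical records, just to illustrate the shape of the data.
records = [
    {"instruction": "Write a haiku.", "response": "Autumn moonlight"},
    {"instruction": "Explain TCP.", "response": "word " * 3000},
]
short = filter_by_length(records, max_tokens=2048)
print(len(short))  # → 1 (the 3000-word response is dropped)
```

Swap in the actual tokenizer used for fine-tuning if the 2048 budget needs to be exact.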
### Usage and License Notices
All airoboros models and datasets are intended and licensed for research use only. I've used the 'cc-by-nc-4.0' license, but really it is subject to a custom/special license because:
- the base model is LLaMa, which has its own special research license
- the dataset(s) were generated with OpenAI (gpt-4 and/or gpt-3.5-turbo), which has a clause saying the data can't be used to create models that compete with OpenAI
So, to reiterate: this model (and datasets) cannot be used commercially. |
tyzhu/find_first_sent_train_100_eval_10_sentbefore | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 435057
num_examples: 320
- name: validation
num_bytes: 10399
num_examples: 10
download_size: 136011
dataset_size: 445456
---
# Dataset Card for "find_first_sent_train_100_eval_10_sentbefore"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/f3e48ad4 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 184
num_examples: 10
download_size: 1324
dataset_size: 184
---
# Dataset Card for "f3e48ad4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CJWeiss/govreport | ---
dataset_info:
features:
- name: report
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 799538925
num_examples: 14598
- name: test
num_bytes: 157374869
num_examples: 2919
- name: valid
num_bytes: 103818773
num_examples: 1946
download_size: 506671700
dataset_size: 1060732567
---
# Dataset Card for "govreport"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sg247/python-codes-25k-llama2 | ---
dataset_info:
features:
- name: output
dtype: string
- name: text
dtype: string
- name: input
dtype: string
- name: instruction
dtype: string
- name: formatted_text
dtype: string
splits:
- name: train
num_bytes: 70955956
num_examples: 49626
download_size: 33910869
dataset_size: 70955956
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SEACrowd/su_id_tts | ---
tags:
- text-to-speech
language:
- sun
---
# su_id_tts
This dataset contains high-quality transcribed audio data for Sundanese. It consists of WAV files and a TSV file. The file `line_index.tsv` contains, one pair per line, a filename and the transcription of the audio in that file. Each filename is prepended with a speaker identification number.
The dataset has been manually quality-checked, but there may still be errors.
This dataset was collected by Google in collaboration with Universitas Pendidikan Indonesia.
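The `line_index.tsv` layout described above can be parsed in a few lines. A minimal sketch, assuming two tab-separated columns (filename, transcription) and filenames shaped like `<lang>_<speaker>_<utterance>`; the sample lines and that filename convention are illustrative assumptions, so check them against the actual files:

```python
import csv
import io

# Hypothetical sample rows in the two-column layout described above.
sample = (
    "sud_00123_00000001\tkalimat conto kahiji\n"
    "sud_00456_00000002\tkalimat conto kadua\n"
)

def parse_line_index(fileobj):
    """Yield (filename, speaker_id, transcription) from a line_index.tsv stream."""
    for filename, transcription in csv.reader(fileobj, delimiter="\t"):
        # Assumes filenames shaped like <lang>_<speaker>_<utterance>.
        speaker_id = filename.split("_")[1]
        yield filename, speaker_id, transcription

rows = list(parse_line_index(io.StringIO(sample)))
print(rows[0])  # → ('sud_00123_00000001', '00123', 'kalimat conto kahiji')
```

Grouping rows by `speaker_id` is then a one-liner with `itertools.groupby` or a dict of lists.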
## Dataset Usage
Run `pip install nusacrowd` before loading the dataset through HuggingFace's `load_dataset`.
## Citation
```
@inproceedings{sodimana18_sltu,
author={Keshan Sodimana and Pasindu {De Silva} and Supheakmungkol Sarin and Oddur Kjartansson and Martin Jansche and Knot Pipatsrisawat and Linne Ha},
title={{A Step-by-Step Process for Building TTS Voices Using Open Source Data and Frameworks for Bangla, Javanese, Khmer, Nepali, Sinhala, and Sundanese}},
year=2018,
booktitle={Proc. 6th Workshop on Spoken Language Technologies for Under-Resourced Languages (SLTU 2018)},
pages={66--70},
doi={10.21437/SLTU.2018-14}
}
```
## License
CC BY-SA 4.0
## Homepage
[http://openslr.org/44/](http://openslr.org/44/)
### NusaCatalogue
For easy indexing and metadata: [https://indonlp.github.io/nusa-catalogue](https://indonlp.github.io/nusa-catalogue) |
freshpearYoon/exp2 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 19209737448
num_examples: 20000
- name: valid
num_bytes: 5676611352
num_examples: 5910
download_size: 3976627266
dataset_size: 24886348800
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
---
|
TinyPixel/s_1 | ---
dataset_info:
features:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 37364158
num_examples: 138748
download_size: 19180486
dataset_size: 37364158
---
# Dataset Card for "wizard_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mahdibaghbanzadeh/GUE_mouse_2 | ---
dataset_info:
features:
- name: sequence
dtype: string
- name: labels
dtype:
class_label:
names:
'0': '0'
'1': '1'
splits:
- name: train
num_bytes: 296060
num_examples: 2620
- name: val
num_bytes: 37064
num_examples: 328
- name: test
num_bytes: 37064
num_examples: 328
download_size: 157789
dataset_size: 370188
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
Rhma/Multitarget-CONAN | ---
dataset_info:
features:
- name: INDEX
dtype: int64
- name: HATE_SPEECH
dtype: string
- name: COUNTER_NARRATIVE
dtype: string
- name: TARGET
dtype: string
- name: VERSION
dtype: string
splits:
- name: train
num_bytes: 874625.724965021
num_examples: 3502
- name: test
num_bytes: 374875.275034979
num_examples: 1501
download_size: 687455
dataset_size: 1249501.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
autoevaluate/autoeval-eval-multi_news-default-e22c67-2252871793 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- multi_news
eval_info:
task: summarization
model: pszemraj/led-base-book-summary
metrics: []
dataset_name: multi_news
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/led-base-book-summary
* Dataset: multi_news
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
liuyanchen1015/MULTI_VALUE_mnli_demonstrative_for_definite_articles | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 1351555
num_examples: 5851
- name: dev_mismatched
num_bytes: 1439245
num_examples: 6018
- name: test_matched
num_bytes: 1364956
num_examples: 5910
- name: test_mismatched
num_bytes: 1412515
num_examples: 5914
- name: train
num_bytes: 54809370
num_examples: 235325
download_size: 39449978
dataset_size: 60377641
---
# Dataset Card for "MULTI_VALUE_mnli_demonstrative_for_definite_articles"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jeffzrraa/Musculoso | ---
license: openrail
---
|