| datasetId | card |
|---|---|
matjs/pt_to_an | ---
license: mit
task_categories:
- translation
language:
- pt
pretty_name: PT-AN
size_categories:
- 1K<n<10K
---
A collection of translations from Portuguese to Angrarosskesh, my fictional language. |
tyzhu/random_letter_same_length_find_passage_train200_eval40_title | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 151441
num_examples: 440
- name: validation
num_bytes: 16031
num_examples: 40
download_size: 81084
dataset_size: 167472
---
# Dataset Card for "random_letter_same_length_find_passage_train200_eval40_title"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-13B | ---
pretty_name: Evaluation run of garage-bAInd/Camel-Platypus2-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [garage-bAInd/Camel-Platypus2-13B](https://huggingface.co/garage-bAInd/Camel-Platypus2-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-13T04:35:13.977731](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-13B/blob/main/results_2023-10-13T04-35-13.977731.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3248741610738255,\n\
\ \"em_stderr\": 0.004796115152921962,\n \"f1\": 0.38906250000000175,\n\
\ \"f1_stderr\": 0.004663274154133875,\n \"acc\": 0.37725358176562207,\n\
\ \"acc_stderr\": 0.006433257710580032\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.3248741610738255,\n \"em_stderr\": 0.004796115152921962,\n\
\ \"f1\": 0.38906250000000175,\n \"f1_stderr\": 0.004663274154133875\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \
\ \"acc_stderr\": 0.0007581501137225365\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7537490134175217,\n \"acc_stderr\": 0.012108365307437528\n\
\ }\n}\n```"
repo_url: https://huggingface.co/garage-bAInd/Camel-Platypus2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|arc:challenge|25_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_13T04_35_13.977731
path:
- '**/details_harness|drop|3_2023-10-13T04-35-13.977731.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-13T04-35-13.977731.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_13T04_35_13.977731
path:
- '**/details_harness|gsm8k|5_2023-10-13T04-35-13.977731.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-13T04-35-13.977731.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hellaswag|10_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T16:10:57.360881.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T16:10:57.360881.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T16:10:57.360881.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_13T04_35_13.977731
path:
- '**/details_harness|winogrande|5_2023-10-13T04-35-13.977731.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-13T04-35-13.977731.parquet'
- config_name: results
data_files:
- split: 2023_08_09T16_10_57.360881
path:
- results_2023-08-09T16:10:57.360881.parquet
- split: 2023_10_13T04_35_13.977731
path:
- results_2023-10-13T04-35-13.977731.parquet
- split: latest
path:
- results_2023-10-13T04-35-13.977731.parquet
---
# Dataset Card for Evaluation run of garage-bAInd/Camel-Platypus2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/garage-bAInd/Camel-Platypus2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [garage-bAInd/Camel-Platypus2-13B](https://huggingface.co/garage-bAInd/Camel-Platypus2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-13B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-13T04:35:13.977731](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-13B/blob/main/results_2023-10-13T04-35-13.977731.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.3248741610738255,
"em_stderr": 0.004796115152921962,
"f1": 0.38906250000000175,
"f1_stderr": 0.004663274154133875,
"acc": 0.37725358176562207,
"acc_stderr": 0.006433257710580032
},
"harness|drop|3": {
"em": 0.3248741610738255,
"em_stderr": 0.004796115152921962,
"f1": 0.38906250000000175,
"f1_stderr": 0.004663274154133875
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225365
},
"harness|winogrande|5": {
"acc": 0.7537490134175217,
"acc_stderr": 0.012108365307437528
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Sultu/Drake | ---
license: openrail
---
|
echarlaix/gqa-lxmert | ---
license: apache-2.0
---
|
Muhacker/Muhac | ---
license: other
---
|
freshpearYoon/train_free_53 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 9604498208
num_examples: 10000
download_size: 1134611842
dataset_size: 9604498208
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yvelos/python_code_instructions_18k_alpaca | ---
license: openrail
---
|
Teklia/NewsEye-Austrian-line | ---
license: mit
language:
- de
task_categories:
- image-to-text
pretty_name: NewsEye-Austrian-line
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_examples: 51588
- name: validation
num_examples: 4379
dataset_size: 55967
tags:
- atr
- htr
- ocr
- historical
- printed
---
# NewsEye Austrian - line level
## Table of Contents
- [NewsEye Austrian - line level](#newseye-austrian-line-level)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
## Dataset Description
- **Homepage:** [NewsEye project](https://www.newseye.eu/)
- **Source:** [Zenodo](https://zenodo.org/records/3387369)
- **Point of Contact:** [TEKLIA](https://teklia.com)
## Dataset Summary
The dataset comprises Austrian newspaper pages from the 19th and early 20th centuries. The images were provided by the Austrian National Library.
### Languages
The documents are in Austrian German, printed in the Fraktur font.
Note that all images are resized to a fixed height of 128 pixels.
## Dataset Structure
### Data Instances
```
{
  'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=4300x128 at 0x1A800E8E190>,
'text': 'Mann; und als wir uns zum Angriff stark genug'
}
```
### Data Fields
- `image`: a PIL.Image.Image object containing the image. Note that when accessing the image column (using dataset[0]["image"]), the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the "image" column, i.e. dataset[0]["image"] should always be preferred over dataset["image"][0].
- `text`: the label transcription of the image. |
gxb912/large-twitter-tweets-sentiment | ---
license: mit
task_categories:
- text-classification
language:
- en
pretty_name: s
size_categories:
- 10M<n<100M
---
# Dataset Card for "Large twitter tweets sentiment analysis"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Splits and Size](#data-splits-and-size)
## Dataset Description
### Dataset Summary
This dataset is a collection of tweets formatted in a tabular data structure, annotated for sentiment analysis.
Each tweet is associated with a sentiment label, with `1` indicating a Positive sentiment and `0` for a Negative sentiment.
### Languages
The tweets are in English.
## Dataset Structure
### Data Instances
An instance of the dataset includes the following fields:
- `text`: a string containing the tweet's content.
- `sentiment`: an integer where `1` indicates Positive sentiment and `0` indicates Negative sentiment.
### Data Splits and Size
The dataset is divided into training and test sets. The sizes are as follows:
- Training set: 179995 instances
- Test set: 44999 instances |
BeIR/scidocs | ---
annotations_creators: []
language_creators: []
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
paperswithcode_id: beir
pretty_name: BEIR Benchmark
size_categories:
msmarco:
- 1M<n<10M
trec-covid:
- 100k<n<1M
nfcorpus:
- 1K<n<10K
nq:
- 1M<n<10M
hotpotqa:
- 1M<n<10M
fiqa:
- 10K<n<100K
arguana:
- 1K<n<10K
touche-2020:
- 100K<n<1M
cqadupstack:
- 100K<n<1M
quora:
- 100K<n<1M
dbpedia:
- 1M<n<10M
scidocs:
- 10K<n<100K
fever:
- 1M<n<10M
climate-fever:
- 1M<n<10M
scifact:
- 1K<n<10K
source_datasets: []
task_categories:
- text-retrieval
- zero-shot-retrieval
- information-retrieval
- zero-shot-information-retrieval
task_ids:
- passage-retrieval
- entity-linking-retrieval
- fact-checking-retrieval
- tweet-retrieval
- citation-prediction-retrieval
- duplication-question-retrieval
- argument-retrieval
- news-retrieval
- biomedical-information-retrieval
- question-answering-retrieval
---
# Dataset Card for BEIR Benchmark
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/UKPLab/beir
- **Repository:** https://github.com/UKPLab/beir
- **Paper:** https://openreview.net/forum?id=wCu6T5xFjeJ
- **Leaderboard:** https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns
- **Point of Contact:** nandan.thakur@uwaterloo.ca
### Dataset Summary
BEIR is a heterogeneous benchmark that has been built from 18 diverse datasets representing 9 information retrieval tasks:
- Fact-checking: [FEVER](http://fever.ai), [Climate-FEVER](http://climatefever.ai), [SciFact](https://github.com/allenai/scifact)
- Question-Answering: [NQ](https://ai.google.com/research/NaturalQuestions), [HotpotQA](https://hotpotqa.github.io), [FiQA-2018](https://sites.google.com/view/fiqa/)
- Bio-Medical IR: [TREC-COVID](https://ir.nist.gov/covidSubmit/index.html), [BioASQ](http://bioasq.org), [NFCorpus](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/)
- News Retrieval: [TREC-NEWS](https://trec.nist.gov/data/news2019.html), [Robust04](https://trec.nist.gov/data/robust/04.guidelines.html)
- Argument Retrieval: [Touche-2020](https://webis.de/events/touche-20/shared-task-1.html), [ArguAna](http://argumentation.bplaced.net/arguana/data)
- Duplicate Question Retrieval: [Quora](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs), [CqaDupstack](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/)
- Citation-Prediction: [SCIDOCS](https://allenai.org/data/scidocs)
- Tweet Retrieval: [Signal-1M](https://research.signal-ai.com/datasets/signal1m-tweetir.html)
- Entity Retrieval: [DBPedia](https://github.com/iai-group/DBpedia-Entity/)
All these datasets have been preprocessed and can be used for your experiments.
```python
# A minimal sketch of loading one preprocessed dataset's corpus, queries and
# qrels with the `datasets` library (names as published under the BeIR org):
from datasets import load_dataset

corpus = load_dataset("BeIR/scidocs", "corpus")
queries = load_dataset("BeIR/scidocs", "queries")
qrels = load_dataset("BeIR/scidocs-qrels")
```
### Supported Tasks and Leaderboards
The dataset supports a leaderboard that evaluates models against task-specific metrics such as F1 or EM, as well as their ability to retrieve supporting information from Wikipedia.
The current best performing models can be found [here](https://eval.ai/web/challenges/challenge-page/689/leaderboard/).
### Languages
All tasks are in English (`en`).
## Dataset Structure
All BEIR datasets must contain a corpus, queries and qrels (relevance judgments file). They must be in the following format:
- `corpus` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with three fields `_id` with unique document identifier, `title` with document title (optional) and `text` with document paragraph or passage. For example: `{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}`
- `queries` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with two fields `_id` with unique query identifier and `text` with query text. For example: `{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}`
- `qrels` file: a `.tsv` file (tab-separated) that contains three columns, i.e. the `query-id`, `corpus-id` and `score`, in this order. Keep the first row as a header. For example: `q1 doc1 1`
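Since these files are plain JSON Lines and TSV, they can be parsed with the standard library alone; a minimal sketch on toy data (file contents are illustrative, taken from the examples above):

```python
import csv
import io
import json

# Toy contents in the three BEIR file formats described above.
corpus_jsonl = '{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}'
queries_jsonl = '{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}'
qrels_tsv = "query-id\tcorpus-id\tscore\nq1\tdoc1\t1"

# corpus/queries: one JSON object per line, indexed here by `_id`.
corpus = {d["_id"]: d for d in map(json.loads, corpus_jsonl.splitlines())}
queries = {d["_id"]: d for d in map(json.loads, queries_jsonl.splitlines())}

# qrels: tab-separated values with a header row.
qrels = {}
for row in csv.DictReader(io.StringIO(qrels_tsv), delimiter="\t"):
    qrels.setdefault(row["query-id"], {})[row["corpus-id"]] = int(row["score"])
```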
### Data Instances
A high level example of any beir dataset:
```python
corpus = {
"doc1" : {
"title": "Albert Einstein",
"text": "Albert Einstein was a German-born theoretical physicist. who developed the theory of relativity, \
one of the two pillars of modern physics (alongside quantum mechanics). His work is also known for \
its influence on the philosophy of science. He is best known to the general public for his mass–energy \
equivalence formula E = mc2, which has been dubbed 'the world's most famous equation'. He received the 1921 \
Nobel Prize in Physics 'for his services to theoretical physics, and especially for his discovery of the law \
of the photoelectric effect', a pivotal step in the development of quantum theory."
},
"doc2" : {
"title": "", # Keep title an empty string if not present
"text": "Wheat beer is a top-fermented beer which is brewed with a large proportion of wheat relative to the amount of \
malted barley. The two main varieties are German Weißbier and Belgian witbier; other types include Lambic (made\
with wild yeast), Berliner Weisse (a cloudy, sour beer), and Gose (a sour, salty beer)."
},
}
queries = {
"q1" : "Who developed the mass-energy equivalence formula?",
"q2" : "Which beer is brewed with a large proportion of wheat?"
}
qrels = {
"q1" : {"doc1": 1},
"q2" : {"doc2": 1},
}
```
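With qrels in that shape, standard retrieval metrics are straightforward to compute. A toy sketch (not part of the BEIR tooling, which reports nDCG@10 and related metrics; the ranked run below is invented):

```python
def precision_at_k(ranked_doc_ids, relevant, k):
    """Fraction of the top-k retrieved documents judged relevant."""
    top_k = ranked_doc_ids[:k]
    return sum(1 for doc_id in top_k if doc_id in relevant) / k

# Relevance judgements in the dict-of-dicts shape shown above.
qrels = {"q1": {"doc1": 1}, "q2": {"doc2": 1}}

# A hypothetical system's ranked results per query.
run = {"q1": ["doc1", "doc2"], "q2": ["doc1", "doc2"]}

scores = {q: precision_at_k(run[q], qrels[q], k=1) for q in qrels}
# scores == {"q1": 1.0, "q2": 0.0}
```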
### Data Fields
Examples from all configurations have the following features:
### Corpus
- `corpus`: a `dict` feature representing the document title and passage text, made up of:
- `_id`: a `string` feature representing the unique document id
- `title`: a `string` feature, denoting the title of the document.
- `text`: a `string` feature, denoting the text of the document.
### Queries
- `queries`: a `dict` feature representing the query, made up of:
- `_id`: a `string` feature representing the unique query id
- `text`: a `string` feature, denoting the text of the query.
### Qrels
- `qrels`: a `dict` feature representing the query document relevance judgements, made up of:
    - `query-id`: a `string` feature representing the query id
    - `corpus-id`: a `string` feature, denoting the document id.
- `score`: a `int32` feature, denoting the relevance judgement between query and document.
### Data Splits
| Dataset | Website| BEIR-Name | Type | Queries | Corpus | Rel D/Q | Down-load | md5 |
| -------- | -----| ---------| --------- | ----------- | ---------| ---------| :----------: | :------:|
| MSMARCO | [Homepage](https://microsoft.github.io/msmarco/)| ``msmarco`` | ``train``<br>``dev``<br>``test``| 6,980 | 8.84M | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/msmarco.zip) | ``444067daf65d982533ea17ebd59501e4`` |
| TREC-COVID | [Homepage](https://ir.nist.gov/covidSubmit/index.html)| ``trec-covid``| ``test``| 50| 171K| 493.5 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/trec-covid.zip) | ``ce62140cb23feb9becf6270d0d1fe6d1`` |
| NFCorpus | [Homepage](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/) | ``nfcorpus`` | ``train``<br>``dev``<br>``test``| 323 | 3.6K | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nfcorpus.zip) | ``a89dba18a62ef92f7d323ec890a0d38d`` |
| BioASQ | [Homepage](http://bioasq.org) | ``bioasq``| ``train``<br>``test`` | 500 | 14.91M | 8.05 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#2-bioasq) |
| NQ | [Homepage](https://ai.google.com/research/NaturalQuestions) | ``nq``| ``train``<br>``test``| 3,452 | 2.68M | 1.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nq.zip) | ``d4d3d2e48787a744b6f6e691ff534307`` |
| HotpotQA | [Homepage](https://hotpotqa.github.io) | ``hotpotqa``| ``train``<br>``dev``<br>``test``| 7,405 | 5.23M | 2.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/hotpotqa.zip) | ``f412724f78b0d91183a0e86805e16114`` |
| FiQA-2018 | [Homepage](https://sites.google.com/view/fiqa/) | ``fiqa`` | ``train``<br>``dev``<br>``test``| 648 | 57K | 2.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fiqa.zip) | ``17918ed23cd04fb15047f73e6c3bd9d9`` |
| Signal-1M(RT) | [Homepage](https://research.signal-ai.com/datasets/signal1m-tweetir.html)| ``signal1m`` | ``test``| 97 | 2.86M | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#4-signal-1m) |
| TREC-NEWS | [Homepage](https://trec.nist.gov/data/news2019.html) | ``trec-news`` | ``test``| 57 | 595K | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#1-trec-news) |
| ArguAna | [Homepage](http://argumentation.bplaced.net/arguana/data) | ``arguana``| ``test`` | 1,406 | 8.67K | 1.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/arguana.zip) | ``8ad3e3c2a5867cdced806d6503f29b99`` |
| Touche-2020| [Homepage](https://webis.de/events/touche-20/shared-task-1.html) | ``webis-touche2020``| ``test``| 49 | 382K | 19.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/webis-touche2020.zip) | ``46f650ba5a527fc69e0a6521c5a23563`` |
| CQADupstack| [Homepage](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/) | ``cqadupstack``| ``test``| 13,145 | 457K | 1.4 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/cqadupstack.zip) | ``4e41456d7df8ee7760a7f866133bda78`` |
| Quora| [Homepage](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs) | ``quora``| ``dev``<br>``test``| 10,000 | 523K | 1.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/quora.zip) | ``18fb154900ba42a600f84b839c173167`` |
| DBPedia | [Homepage](https://github.com/iai-group/DBpedia-Entity/) | ``dbpedia-entity``| ``dev``<br>``test``| 400 | 4.63M | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/dbpedia-entity.zip) | ``c2a39eb420a3164af735795df012ac2c`` |
| SCIDOCS| [Homepage](https://allenai.org/data/scidocs) | ``scidocs``| ``test``| 1,000 | 25K | 4.9 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scidocs.zip) | ``38121350fc3a4d2f48850f6aff52e4a9`` |
| FEVER | [Homepage](http://fever.ai) | ``fever``| ``train``<br>``dev``<br>``test``| 6,666 | 5.42M | 1.2| [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fever.zip) | ``5a818580227bfb4b35bb6fa46d9b6c03`` |
| Climate-FEVER| [Homepage](http://climatefever.ai) | ``climate-fever``|``test``| 1,535 | 5.42M | 3.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/climate-fever.zip) | ``8b66f0a9126c521bae2bde127b4dc99d`` |
| SciFact| [Homepage](https://github.com/allenai/scifact) | ``scifact``| ``train``<br>``test``| 300 | 5K | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scifact.zip) | ``5f7d1de60b170fc8027bb7898e2efca1`` |
| Robust04 | [Homepage](https://trec.nist.gov/data/robust/04.guidelines.html) | ``robust04``| ``test``| 249 | 528K | 69.9 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#3-robust04) |
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
Cite as:
```
@inproceedings{
thakur2021beir,
title={{BEIR}: A Heterogeneous Benchmark for Zero-shot Evaluation of Information Retrieval Models},
author={Nandan Thakur and Nils Reimers and Andreas R{\"u}ckl{\'e} and Abhishek Srivastava and Iryna Gurevych},
booktitle={Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 2)},
year={2021},
url={https://openreview.net/forum?id=wCu6T5xFjeJ}
}
```
### Contributions
Thanks to [@Nthakur20](https://github.com/Nthakur20) for adding this dataset. |
CrisPO/Demo_clase_platzi | ---
license: mit
---
|
bigheiniuJ/BBH_eval | ---
dataset_info:
features:
- name: input
dtype: string
- name: target
dtype: string
- name: task
dtype: string
- name: options
sequence: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 2641563
num_examples: 4071
download_size: 570189
dataset_size: 2641563
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "BBH_eval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alarmod/yolo_gestures | ---
license: gpl-3.0
---
Dataset for UAV control, containing the gesture commands “take-off”, “landing”, “stop” and “return home” used when controlling the UAV.
rabasi/marmail-demo | ---
dataset_info:
features:
- name: product
dtype: string
- name: description
dtype: string
- name: marketing_email
dtype: string
splits:
- name: train
num_bytes: 10429
num_examples: 10
download_size: 14536
dataset_size: 10429
---
# Dataset Card for "marmail-demo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_TokenBender__pic_7B_mistral_Full_v0.1 | ---
pretty_name: Evaluation run of TokenBender/pic_7B_mistral_Full_v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TokenBender/pic_7B_mistral_Full_v0.1](https://huggingface.co/TokenBender/pic_7B_mistral_Full_v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TokenBender__pic_7B_mistral_Full_v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-08T01:00:48.190749](https://huggingface.co/datasets/open-llm-leaderboard/details_TokenBender__pic_7B_mistral_Full_v0.1/blob/main/results_2023-12-08T01-00-48.190749.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6335554932812488,\n\
\ \"acc_stderr\": 0.03234608898724019,\n \"acc_norm\": 0.6365587293846601,\n\
\ \"acc_norm_stderr\": 0.03299054248415427,\n \"mc1\": 0.379436964504284,\n\
\ \"mc1_stderr\": 0.016987039266142978,\n \"mc2\": 0.5451115248499341,\n\
\ \"mc2_stderr\": 0.015141183727073078\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6194539249146758,\n \"acc_stderr\": 0.014188277712349812,\n\
\ \"acc_norm\": 0.6390784982935154,\n \"acc_norm_stderr\": 0.014034761386175452\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6406094403505278,\n\
\ \"acc_stderr\": 0.004788412062375695,\n \"acc_norm\": 0.8369846644094802,\n\
\ \"acc_norm_stderr\": 0.0036862475593618534\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.02906722014664483,\n\
\ \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.02906722014664483\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155236,\n \"\
acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155236\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.02275520495954294,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.02275520495954294\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.0291265228345868,\n \"acc_norm\"\
: 0.7878787878787878,\n \"acc_norm_stderr\": 0.0291265228345868\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6333333333333333,\n \"acc_stderr\": 0.02443301646605246,\n \
\ \"acc_norm\": 0.6333333333333333,\n \"acc_norm_stderr\": 0.02443301646605246\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228405,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228405\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n\
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640766,\n \"\
acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640766\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.0376017800602662,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.0376017800602662\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834838,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834838\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6791907514450867,\n \"acc_stderr\": 0.025131000233647886,\n\
\ \"acc_norm\": 0.6791907514450867,\n \"acc_norm_stderr\": 0.025131000233647886\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35083798882681566,\n\
\ \"acc_stderr\": 0.01596103667523096,\n \"acc_norm\": 0.35083798882681566,\n\
\ \"acc_norm_stderr\": 0.01596103667523096\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875192,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875192\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.025557653981868062,\n\
\ \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.025557653981868062\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4432624113475177,\n \"acc_stderr\": 0.029634838473766006,\n \
\ \"acc_norm\": 0.4432624113475177,\n \"acc_norm_stderr\": 0.029634838473766006\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4530638852672751,\n\
\ \"acc_stderr\": 0.012713845972358983,\n \"acc_norm\": 0.4530638852672751,\n\
\ \"acc_norm_stderr\": 0.012713845972358983\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223977,\n \
\ \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223977\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142777,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142777\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.379436964504284,\n\
\ \"mc1_stderr\": 0.016987039266142978,\n \"mc2\": 0.5451115248499341,\n\
\ \"mc2_stderr\": 0.015141183727073078\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.011661223637643416\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5269143290371494,\n \
\ \"acc_stderr\": 0.013752517189717447\n }\n}\n```"
repo_url: https://huggingface.co/garage-bAInd/Camel-Platypus2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|arc:challenge|25_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|gsm8k|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hellaswag|10_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-08T01-00-48.190749.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-08T01-00-48.190749.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- '**/details_harness|winogrande|5_2023-12-08T01-00-48.190749.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-08T01-00-48.190749.parquet'
- config_name: results
data_files:
- split: 2023_12_08T01_00_48.190749
path:
- results_2023-12-08T01-00-48.190749.parquet
- split: latest
path:
- results_2023-12-08T01-00-48.190749.parquet
---
# Dataset Card for Evaluation run of TokenBender/pic_7B_mistral_Full_v0.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TokenBender/pic_7B_mistral_Full_v0.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TokenBender/pic_7B_mistral_Full_v0.1](https://huggingface.co/TokenBender/pic_7B_mistral_Full_v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TokenBender__pic_7B_mistral_Full_v0.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-08T01:00:48.190749](https://huggingface.co/datasets/open-llm-leaderboard/details_TokenBender__pic_7B_mistral_Full_v0.1/blob/main/results_2023-12-08T01-00-48.190749.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6335554932812488,
"acc_stderr": 0.03234608898724019,
"acc_norm": 0.6365587293846601,
"acc_norm_stderr": 0.03299054248415427,
"mc1": 0.379436964504284,
"mc1_stderr": 0.016987039266142978,
"mc2": 0.5451115248499341,
"mc2_stderr": 0.015141183727073078
},
"harness|arc:challenge|25": {
"acc": 0.6194539249146758,
"acc_stderr": 0.014188277712349812,
"acc_norm": 0.6390784982935154,
"acc_norm_stderr": 0.014034761386175452
},
"harness|hellaswag|10": {
"acc": 0.6406094403505278,
"acc_stderr": 0.004788412062375695,
"acc_norm": 0.8369846644094802,
"acc_norm_stderr": 0.0036862475593618534
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.02906722014664483,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.02906722014664483
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155236,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155236
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8,
"acc_stderr": 0.02275520495954294,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02275520495954294
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.0291265228345868,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.0291265228345868
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6333333333333333,
"acc_stderr": 0.02443301646605246,
"acc_norm": 0.6333333333333333,
"acc_norm_stderr": 0.02443301646605246
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228405,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228405
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640766,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640766
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.0376017800602662,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.0376017800602662
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834838,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834838
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6791907514450867,
"acc_stderr": 0.025131000233647886,
"acc_norm": 0.6791907514450867,
"acc_norm_stderr": 0.025131000233647886
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35083798882681566,
"acc_stderr": 0.01596103667523096,
"acc_norm": 0.35083798882681566,
"acc_norm_stderr": 0.01596103667523096
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875192,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875192
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.025557653981868062,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.025557653981868062
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4432624113475177,
"acc_stderr": 0.029634838473766006,
"acc_norm": 0.4432624113475177,
"acc_norm_stderr": 0.029634838473766006
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4530638852672751,
"acc_stderr": 0.012713845972358983,
"acc_norm": 0.4530638852672751,
"acc_norm_stderr": 0.012713845972358983
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.019270998708223977,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.019270998708223977
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142777,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142777
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.379436964504284,
"mc1_stderr": 0.016987039266142978,
"mc2": 0.5451115248499341,
"mc2_stderr": 0.015141183727073078
},
"harness|winogrande|5": {
"acc": 0.7790055248618785,
"acc_stderr": 0.011661223637643416
},
"harness|gsm8k|5": {
"acc": 0.5269143290371494,
"acc_stderr": 0.013752517189717447
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
allenai/scico | ---
annotations_creators:
- domain experts
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
task_categories:
- token-classification
task_ids:
- coreference-resolution
paperswithcode_id: scico
tags:
- cross-document-coreference-resolution
- structure-prediction
---
# Dataset Card for SciCo
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [SciCo homepage](https://scico.apps.allenai.org/)
- **Repository:** [SciCo repository](https://github.com/ariecattan/scico)
- **Paper:** [SciCo: Hierarchical Cross-document Coreference for Scientific Concepts](https://openreview.net/forum?id=OFLbgUP04nC)
- **Point of Contact:** [Arie Cattan](arie.cattan@gmail.com)
### Dataset Summary
SciCo consists of clusters of mentions in context and a hierarchy over them.
The corpus is drawn from computer science papers, and the concept mentions are methods and tasks from across CS.
Scientific concepts pose significant challenges: they often take diverse forms (e.g., class-conditional image
synthesis and categorical image generation) or are ambiguous (e.g., network architecture in AI vs.
systems research).
To build SciCo, we develop a new candidate generation
approach built on three resources: a low-coverage KB ([https://paperswithcode.com/](https://paperswithcode.com/)), a noisy hypernym extractor, and curated
candidates.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
The text in the dataset is in English.
## Dataset Structure
### Data Instances
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Data Fields
* `flatten_tokens`: a single list of all tokens in the topic
* `flatten_mentions`: array of mentions, each mention is represented by [start, end, cluster_id]
* `tokens`: array of paragraphs
* `doc_ids`: doc_id of each paragraph in `tokens`
* `metadata`: metadata of each doc_id
* `sentences`: sentences boundaries for each paragraph in `tokens` [start, end]
* `mentions`: array of mentions, each mention is represented by [paragraph_id, start, end, cluster_id]
* `relations`: array of binary relations between cluster_ids [parent, child]
* `id`: id of the topic
* `hard_10` and `hard_20` (only in the test set): flag for 10% or 20% hardest topics based on Levenshtein similarity.
* `source`: source of this topic PapersWithCode (pwc), hypernym or curated.
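To make the mention encoding above concrete, here is a minimal sketch of how to recover mention surface forms from `tokens` and `mentions`. The record is a toy one mimicking the schema (it is not actual SciCo data), and treating `end` as inclusive is an assumption:

```python
# Toy topic record that mimics the SciCo schema described above.
# Field names come from the card; the data itself is made up for illustration.
topic = {
    "tokens": [
        ["We", "propose", "a", "new", "network", "architecture"],
        ["Our", "image", "generation", "model", "is", "fast"],
    ],
    # Each mention is [paragraph_id, start, end, cluster_id]
    "mentions": [[0, 4, 5, 7], [1, 1, 2, 3]],
}

def mention_text(topic, mention):
    """Return the surface form of one mention (assumes `end` is inclusive)."""
    paragraph_id, start, end, _cluster_id = mention
    return " ".join(topic["tokens"][paragraph_id][start:end + 1])

surface_forms = [mention_text(topic, m) for m in topic["mentions"]]
print(surface_forms)  # ['network architecture', 'image generation']
```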
### Data Splits
| |Train |Validation|Test |
|--------------------|-----:|---------:|----:|
|Topic | 221| 100| 200|
|Documents | 9013| 4120| 8237|
|Mentions | 10925| 4874|10424|
|Clusters | 4080| 1867| 3711|
|Relations | 2514| 1747| 2379|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
## Additional Information
### Dataset Curators
This dataset was initially created by Arie Cattan, Sophie Johnson, Daniel Weld, Ido Dagan, Iz Beltagy, Doug Downey and Tom Hope, while Arie was an intern at the Allen Institute for Artificial Intelligence.
### Licensing Information
This dataset is distributed under [Apache License 2.0](http://www.apache.org/licenses/LICENSE-2.0).
### Citation Information
```
@inproceedings{
cattan2021scico,
title={SciCo: Hierarchical Cross-Document Coreference for Scientific Concepts},
author={Arie Cattan and Sophie Johnson and Daniel S. Weld and Ido Dagan and Iz Beltagy and Doug Downey and Tom Hope},
booktitle={3rd Conference on Automated Knowledge Base Construction},
year={2021},
url={https://openreview.net/forum?id=OFLbgUP04nC}
}
```
### Contributions
Thanks to [@ariecattan](https://github.com/ariecattan) for adding this dataset.
|
open-cn-llm-leaderboard/mmlu_asc | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: validation
path: data/validation-*
- split: dev
path: data/dev-*
dataset_info:
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 3483726.5
num_examples: 7021
- name: validation
num_bytes: 763484
num_examples: 1531
- name: dev
num_bytes: 125353
num_examples: 285
download_size: 2482166
dataset_size: 4372563.5
---
|
Fearao/guba_eastmoney | ---
task_categories:
- text-classification
language:
- zh
---
The data comes from comments on Eastmoney's Guba stock forum and has been manually labeled. |
gryffindor-ISWS/stable-diffusion-2-1-without-images | ---
license: gpl-3.0
task_categories:
- text-to-image
language:
- en
tags:
- art
size_categories:
- 1K<n<10K
--- |
JoffreyMa/BGDIA704_faces | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: int64
- name: genre
dtype: int64
splits:
- name: train
num_bytes: 942521828.16
num_examples: 192576
download_size: 900725876
dataset_size: 942521828.16
---
# Dataset Card for "BGDIA704_faces"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sharathhebbar24/Indian-Constitution | ---
license: apache-2.0
task_categories:
- text-classification
- text-generation
- text2text-generation
language:
- en
---
# Indian Constitution Dataset
The dataset can be used for text classification, text generation, and text2text generation. |
AIRI-NLP/quality_counter_new_2048 | ---
dataset_info:
features:
- name: context
dtype: string
- name: word
dtype: string
- name: claim
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 557027166
num_examples: 20000
- name: validation
num_bytes: 226226606
num_examples: 8000
- name: test
num_bytes: 56238220
num_examples: 2300
download_size: 26618603
dataset_size: 839491992
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
lmsys/mt_bench_human_judgments | ---
dataset_info:
features:
- name: question_id
dtype: int64
- name: model_a
dtype: string
- name: model_b
dtype: string
- name: winner
dtype: string
- name: judge
dtype: string
- name: conversation_a
list:
- name: content
dtype: string
- name: role
dtype: string
- name: conversation_b
list:
- name: content
dtype: string
- name: role
dtype: string
- name: turn
dtype: int64
splits:
- name: human
num_bytes: 15003469
num_examples: 3355
- name: gpt4_pair
num_bytes: 10679650
num_examples: 2400
download_size: 1388888
dataset_size: 25683119
license: cc-by-4.0
task_categories:
- conversational
- question-answering
language:
- en
size_categories:
- 1K<n<10K
---
## Content
This dataset contains 3.3K expert-level pairwise human preferences for model responses generated by 6 models in response to 80 MT-bench questions.
The 6 models are GPT-4, GPT-3.5, Claude-v1, Vicuna-13B, Alpaca-13B, and LLaMA-13B. The annotators are mostly graduate students with expertise in the topic areas of each of the questions. The details of data collection can be found in our [paper](https://arxiv.org/abs/2306.05685).
## Agreement Calculation
This Colab [notebook](https://colab.research.google.com/drive/1ctgygDRJhVGUJTQy8-bRZCl1WNcT8De6?usp=sharing) shows how to compute the agreement between humans and the GPT-4 judge with the dataset. Our results show that humans and the GPT-4 judge achieve over 80% agreement, the same level of agreement as between humans.
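A pairwise agreement rate of this kind can be sketched as follows. This is a toy illustration under assumed structure (votes keyed by question and model pair), not the notebook's actual code:

```python
def agreement_rate(votes_a, votes_b):
    """Fraction of shared (question_id, model_a, model_b) keys on which
    the two judges picked the same winner."""
    shared = set(votes_a) & set(votes_b)
    if not shared:
        return 0.0
    matches = sum(votes_a[k] == votes_b[k] for k in shared)
    return matches / len(shared)

# Toy votes for two judges over three comparisons (made-up data).
human = {(81, "gpt-4", "vicuna-13b"): "model_a",
         (82, "gpt-4", "vicuna-13b"): "model_b",
         (83, "gpt-3.5", "llama-13b"): "model_a"}
gpt4_judge = {(81, "gpt-4", "vicuna-13b"): "model_a",
              (82, "gpt-4", "vicuna-13b"): "model_a",
              (83, "gpt-3.5", "llama-13b"): "model_a"}

print(agreement_rate(human, gpt4_judge))  # 2 of 3 shared keys agree
```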
## Citation
```
@misc{zheng2023judging,
title={Judging LLM-as-a-judge with MT-Bench and Chatbot Arena},
author={Lianmin Zheng and Wei-Lin Chiang and Ying Sheng and Siyuan Zhuang and Zhanghao Wu and Yonghao Zhuang and Zi Lin and Zhuohan Li and Dacheng Li and Eric. P Xing and Hao Zhang and Joseph E. Gonzalez and Ion Stoica},
year={2023},
eprint={2306.05685},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
ashraq/financial-news-articles | ---
dataset_info:
features:
- name: title
dtype: string
- name: text
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 848347009
num_examples: 306242
download_size: 492243206
dataset_size: 848347009
---
# Dataset Card for "financial-news-articles"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
The data was obtained from [here](https://www.kaggle.com/datasets/jeet2016/us-financial-news-articles) |
zolak/twitter_dataset_50_1713172910 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 330647
num_examples: 861
download_size: 167920
dataset_size: 330647
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
IndonesiaAI/stack-split-1_translated | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: qid
dtype: string
- name: question
dtype: string
- name: response_j
dtype: string
- name: response_k
dtype: string
splits:
- name: train
num_bytes: 3206490021
num_examples: 1056803
download_size: 951479401
dataset_size: 3206490021
---
# Dataset Card for "stack-split-1_translated"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
louisbrulenaudet/code-domaine-public-fluvial-navigation-interieure | ---
license: apache-2.0
language:
- fr
multilinguality:
- monolingual
tags:
- finetuning
- legal
- french law
- droit français
- Code du domaine public fluvial et de la navigation intérieure
source_datasets:
- original
pretty_name: Code du domaine public fluvial et de la navigation intérieure
task_categories:
- text-generation
- table-question-answering
- summarization
- text-retrieval
- question-answering
- text-classification
size_categories:
- 1K<n<10K
---
# Code du domaine public fluvial et de la navigation intérieure, non-instruct (2024-04-15)
This project focuses on fine-tuning pre-trained language models to create efficient and accurate models for legal practice.
Fine-tuning is the process of adapting a pre-trained model to perform specific tasks or cater to particular domains. It involves adjusting the model's parameters through a further round of training on task-specific or domain-specific data. While conventional fine-tuning strategies involve supervised learning with labeled data, instruction-based fine-tuning introduces a more structured and interpretable approach.
Instruction-based fine-tuning leverages the power of human-provided instructions to guide the model's behavior. These instructions can be in the form of text prompts, prompts with explicit task descriptions, or a combination of both. This approach allows for a more controlled and context-aware interaction with the LLM, making it adaptable to a multitude of specialized tasks.
Instruction-based fine-tuning significantly enhances the performance of LLMs in the following ways:
- Task-Specific Adaptation: LLMs, when fine-tuned with specific instructions, exhibit remarkable adaptability to diverse tasks. They can switch seamlessly between translation, summarization, and question-answering, guided by the provided instructions.
- Reduced Ambiguity: Traditional LLMs might generate ambiguous or contextually inappropriate responses. Instruction-based fine-tuning allows for a clearer and more context-aware generation, reducing the likelihood of nonsensical outputs.
- Efficient Knowledge Transfer: Instructions can encapsulate domain-specific knowledge, enabling LLMs to benefit from expert guidance. This knowledge transfer is particularly valuable in fields like tax practice, law, medicine, and more.
- Interpretability: Instruction-based fine-tuning also makes LLM behavior more interpretable. Since the instructions are human-readable, it becomes easier to understand and control model outputs.
- Adaptive Behavior: LLMs, post instruction-based fine-tuning, exhibit adaptive behavior that is responsive to both explicit task descriptions and implicit cues within the provided text.
## Concurrent reading of the LegalKit
To use all the legal data published on LegalKit, you can use this code snippet:
```python
# -*- coding: utf-8 -*-
import concurrent.futures
import logging
import os
import datasets
from tqdm.notebook import tqdm
def dataset_loader(
name:str,
streaming:bool=True
) -> datasets.Dataset:
"""
Helper function to load a single dataset in parallel.
Parameters
----------
name : str
Name of the dataset to be loaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
dataset : datasets.Dataset
Loaded dataset object.
Raises
------
Exception
If an error occurs during dataset loading.
"""
try:
return datasets.load_dataset(
name,
split="train",
streaming=streaming
)
except Exception as exc:
logging.error(f"Error loading dataset {name}: {exc}")
return None
def load_datasets(
req:list,
streaming:bool=True
) -> list:
"""
Downloads datasets specified in a list and creates a list of loaded datasets.
Parameters
----------
req : list
A list containing the names of datasets to be downloaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
datasets_list : list
A list containing loaded datasets as per the requested names provided in 'req'.
Raises
------
Exception
If an error occurs during dataset loading or processing.
Examples
--------
>>> datasets = load_datasets(["dataset1", "dataset2"], streaming=False)
"""
datasets_list = []
with concurrent.futures.ThreadPoolExecutor() as executor:
future_to_dataset = {executor.submit(dataset_loader, name): name for name in req}
for future in tqdm(concurrent.futures.as_completed(future_to_dataset), total=len(req)):
name = future_to_dataset[future]
try:
dataset = future.result()
if dataset:
datasets_list.append(dataset)
except Exception as exc:
logging.error(f"Error processing dataset {name}: {exc}")
return datasets_list
req = [
"louisbrulenaudet/code-artisanat",
"louisbrulenaudet/code-action-sociale-familles",
# ...
]
datasets_list = load_datasets(
req=req,
streaming=True
)
dataset = datasets.concatenate_datasets(
datasets_list
)
```
## Dataset generation
This JSON file is a list of dictionaries, each dictionary contains the following fields:
- `instruction`: `string`, presenting the instruction linked to the element.
- `input`: `string`, signifying the input details for the element.
- `output`: `string`, indicating the output information for the element.
- `start`: `string`, the date of entry into force of the article.
- `expiration`: `string`, the date of expiration of the article.
- `num`: `string`, the id of the article.
We used the following list of instructions for generating the dataset:
```python
instructions = [
"Compose l'intégralité de l'article sous forme écrite.",
"Écris la totalité du contenu de l'article.",
"Formule la totalité du texte présent dans l'article.",
"Produis l'intégralité de l'article en écriture.",
"Développe l'article dans son ensemble par écrit.",
"Génère l'ensemble du texte contenu dans l'article.",
"Formule le contenu intégral de l'article en entier.",
"Rédige la totalité du texte de l'article en entier.",
"Compose l'intégralité du contenu textuel de l'article.",
"Rédige l'ensemble du texte qui constitue l'article.",
"Formule l'article entier dans son contenu écrit.",
"Composez l'intégralité de l'article sous forme écrite.",
"Écrivez la totalité du contenu de l'article.",
"Formulez la totalité du texte présent dans l'article.",
"Développez l'article dans son ensemble par écrit.",
"Générez l'ensemble du texte contenu dans l'article.",
"Formulez le contenu intégral de l'article en entier.",
"Rédigez la totalité du texte de l'article en entier.",
"Composez l'intégralité du contenu textuel de l'article.",
"Écrivez l'article dans son intégralité en termes de texte.",
"Rédigez l'ensemble du texte qui constitue l'article.",
"Formulez l'article entier dans son contenu écrit.",
"Composer l'intégralité de l'article sous forme écrite.",
"Écrire la totalité du contenu de l'article.",
"Formuler la totalité du texte présent dans l'article.",
"Produire l'intégralité de l'article en écriture.",
"Développer l'article dans son ensemble par écrit.",
"Générer l'ensemble du texte contenu dans l'article.",
"Formuler le contenu intégral de l'article en entier.",
"Rédiger la totalité du texte de l'article en entier.",
"Composer l'intégralité du contenu textuel de l'article.",
"Rédiger l'ensemble du texte qui constitue l'article.",
"Formuler l'article entier dans son contenu écrit.",
"Quelles sont les dispositions de l'article ?",
"Quelles dispositions sont incluses dans l'article ?",
"Quelles sont les dispositions énoncées dans l'article ?",
"Quel est le texte intégral de l'article ?",
"Quelle est la lettre de l'article ?"
]
```
## Feedback
If you have any feedback, please reach out at [louisbrulenaudet@icloud.com](mailto:louisbrulenaudet@icloud.com). |
liuyanchen1015/MULTI_VALUE_cola_invariant_tag_non_concord | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 443
num_examples: 6
- name: test
num_bytes: 366
num_examples: 6
- name: train
num_bytes: 6767
num_examples: 94
download_size: 9210
dataset_size: 7576
---
# Dataset Card for "MULTI_VALUE_cola_invariant_tag_non_concord"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ovior/twitter_dataset_1713116827 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2279791
num_examples: 7255
download_size: 1280745
dataset_size: 2279791
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_rte_regularized_reflexives | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 14223
num_examples: 30
- name: train
num_bytes: 15681
num_examples: 34
download_size: 30171
dataset_size: 29904
---
# Dataset Card for "MULTI_VALUE_rte_regularized_reflexives"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
edarchimbaud/short-interest-stocks | ---
language:
- en
license: mit
task_categories:
- tabular-regression
dataset_info:
features:
- name: symbol
dtype: string
- name: date
dtype: string
- name: id
dtype: int64
- name: settlement_date
dtype: timestamp[ns]
- name: interest
dtype: float64
- name: avg_daily_share_volume
dtype: float64
- name: days_to_cover
dtype: float64
splits:
- name: train
num_bytes: 8920052
num_examples: 143902
download_size: 1015695
dataset_size: 8920052
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "short-interest-sp500"
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://edarchimbaud.substack.com
- **Repository:** https://github.com/edarchimbaud
- **Point of Contact:** contact@edarchimbaud.com
### Dataset Summary
The short-interest-sp500 dataset provides short interest data for companies listed on the S&P 500 index. This includes the number of shares that have been sold short but have not yet been covered or closed out.
### Supported Tasks and Leaderboards
[N/A]
### Languages
[N/A]
## Dataset Structure
### Data Instances
[N/A]
### Data Fields
- symbol (string): A string representing the ticker symbol or abbreviation used to identify the company.
- date (string): A string representing the date when the data was collected.
- id (int64): A unique integer identifier for each data instance.
- settlement_date (timestamp[ns]): The date by which a buyer must pay for the securities delivered by the seller.
- interest (float64): A floating point number representing the short interest of the company on the specified date.
- avg_daily_share_volume (float64): A floating point number representing the average daily trading volume of the company.
- days_to_cover (float64): A floating point number representing the days-to-cover metric, i.e., the number of days of average trading volume needed to cover the short interest.
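As a sketch of how these fields relate, days_to_cover is the short interest divided by the average daily share volume. The helper below is purely illustrative and not part of the dataset:

```python
def days_to_cover(interest: float, avg_daily_share_volume: float) -> float:
    """Days of average trading volume needed to cover the short interest."""
    return interest / avg_daily_share_volume

# 1,000,000 shares sold short, 200,000 shares traded per day on average:
print(days_to_cover(1_000_000, 200_000))  # 5.0
```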
### Data Splits
[N/A]
## Dataset Creation
### Curation Rationale
The short-interest-sp500 dataset was created to facilitate the study of market dynamics, particularly the role of short selling.
### Source Data
#### Initial Data Collection and Normalization
The dataset was compiled from publicly available sources.
### Annotations
#### Annotation process
[N/A]
#### Who are the annotators?
[N/A]
### Personal and Sensitive Information
[N/A]
## Considerations for Using the Data
### Social Impact of Dataset
[N/A]
### Discussion of Biases
[N/A]
### Other Known Limitations
[N/A]
## Additional Information
### Dataset Curators
The short-interest-sp500 dataset was collected by https://edarchimbaud.substack.com.
### Licensing Information
The short-interest-sp500 dataset is licensed under the MIT License.
### Citation Information
> https://edarchimbaud.substack.com, short-interest-sp500 dataset, GitHub repository, https://github.com/edarchimbaud
### Contributions
Thanks to [@edarchimbaud](https://github.com/edarchimbaud) for adding this dataset. |
medmac01/CIRCL_MISP_240K_Embedded | ---
dataset_info:
features:
- name: event_id
dtype: int64
- name: event_title
dtype: string
- name: event_timestamp
dtype: string
- name: event_date
dtype: string
- name: event_tags
dtype: string
- name: category
dtype: string
- name: type
dtype: string
- name: value
dtype: string
- name: attribute_tags
dtype: string
- name: value_to_vectorise
dtype: string
- name: value_vectorised
sequence: float32
splits:
- name: train
num_bytes: 804251505
num_examples: 230429
download_size: 869660198
dataset_size: 804251505
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AlexCambell/HeartFailureDataset | ---
pretty_name: Cardiovascular dataset
size_categories:
- 10K<n<100K
--- |
CyberHarem/yato_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yato/ヤトウ/夜刀 (Arknights)
This is the dataset of yato/ヤトウ/夜刀 (Arknights), containing 127 images and their tags.
The core tags of this character are `horns, brown_hair, breasts, long_hair, blue_eyes, multicolored_hair, pointy_ears, hair_between_eyes, white_hair, fake_horns, large_breasts, mole`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 127 | 255.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yato_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 127 | 209.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yato_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 326 | 418.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yato_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/yato_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 30 |  |  |  |  |  | 1girl, bare_shoulders, kirin_(armor), solo, navel, stomach, midriff, fur_trim, looking_at_viewer, cleavage, black_gloves, necklace, black_belt, single_detached_sleeve, simple_background, crop_top, garter_straps, white_background, cowboy_shot, holding_weapon, mole_under_eye, medium_breasts, standing, belt_buckle, smile, black_thighhighs |
| 1 | 8 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, kirin_(armor), midriff, navel, necklace, solo, stomach, black_belt, cowboy_shot, looking_at_viewer, black_gloves, white_background, medium_breasts, pendant, simple_background, single_horn, standing, crop_top, fur_trim, groin, hairband |
| 2 | 17 |  |  |  |  |  | 1girl, ponytail, black_jacket, black_skirt, open_jacket, solo, pleated_skirt, id_card, black_pantyhose, blindfold, holding_sword, short_over_long_sleeves, simple_background, white_background, grey_shirt, closed_mouth, miniskirt, white_shirt |
| 3 | 5 |  |  |  |  |  | 1girl, black_jacket, grey_shirt, open_jacket, solo, upper_body, ponytail, blindfold, closed_mouth, id_card, mask, short_over_long_sleeves, simple_background, black_scarf, blush, purple_hair, white_background, white_shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | kirin_(armor) | solo | navel | stomach | midriff | fur_trim | looking_at_viewer | cleavage | black_gloves | necklace | black_belt | single_detached_sleeve | simple_background | crop_top | garter_straps | white_background | cowboy_shot | holding_weapon | mole_under_eye | medium_breasts | standing | belt_buckle | smile | black_thighhighs | pendant | single_horn | groin | hairband | ponytail | black_jacket | black_skirt | open_jacket | pleated_skirt | id_card | black_pantyhose | blindfold | holding_sword | short_over_long_sleeves | grey_shirt | closed_mouth | miniskirt | white_shirt | upper_body | mask | black_scarf | blush | purple_hair |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:----------------|:-------|:--------|:----------|:----------|:-----------|:--------------------|:-----------|:---------------|:-----------|:-------------|:-------------------------|:--------------------|:-----------|:----------------|:-------------------|:--------------|:-----------------|:-----------------|:-----------------|:-----------|:--------------|:--------|:-------------------|:----------|:--------------|:--------|:-----------|:-----------|:---------------|:--------------|:--------------|:----------------|:----------|:------------------|:------------|:----------------|:--------------------------|:-------------|:---------------|:------------|:--------------|:-------------|:-------|:--------------|:--------|:--------------|
| 0 | 30 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | X | | X | X | | | X | X | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 2 | 17 |  |  |  |  |  | X | | | X | | | | | | | | | | | X | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | X | | | | | | | | | | | X | | | X | | | | | | | | | | | | | X | X | | X | | X | | X | | X | X | X | | X | X | X | X | X | X |
|
jianshengli/MLLMs | ---
license: apache-2.0
---
|
birgermoell/ravdess | ---
license: cc-by-nc-sa-4.0
---
The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS)
Creators: Livingstone, Steven R.; Russo, Frank A.
Description
Citing the RAVDESS
The RAVDESS is released under a Creative Commons Attribution license, so please cite the RAVDESS if it is used in your work in any form. Published academic papers should use the academic paper citation for our PLoS1 paper. Personal works, such as machine learning projects/blog posts, should provide a URL to this Zenodo page, though a reference to our PLoS1 paper would also be appreciated.
Academic paper citation
Livingstone SR, Russo FA (2018) The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS): A dynamic, multimodal set of facial and vocal expressions in North American English. PLoS ONE 13(5): e0196391. https://doi.org/10.1371/journal.pone.0196391.
Personal use citation
Include a link to this Zenodo page - https://zenodo.org/record/1188976
Commercial Licenses
Commercial licenses for the RAVDESS can be purchased. For more information, please visit our license fee page, or contact us at ravdess@gmail.com.
Contact Information
If you would like further information about the RAVDESS, to purchase a commercial license, or if you experience any issues downloading files, please contact us at ravdess@gmail.com.
Example Videos
Watch a sample of the RAVDESS speech and song videos.
Emotion Classification Users
If you're interested in using machine learning to classify emotional expressions with the RAVDESS, please see our new RAVDESS Facial Landmark Tracking data set [Zenodo project page].
Construction and Validation
Full details on the construction and perceptual validation of the RAVDESS are described in our PLoS ONE paper - https://doi.org/10.1371/journal.pone.0196391.
The RAVDESS contains 7356 files. Each file was rated 10 times on emotional validity, intensity, and genuineness. Ratings were provided by 247 individuals who were characteristic of untrained adult research participants from North America. A further set of 72 participants provided test-retest data. High levels of emotional validity, interrater reliability, and test-retest intrarater reliability were reported. Validation data is open-access, and can be downloaded along with our paper from PLoS ONE.
Description
The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS) contains 7356 files (total size: 24.8 GB). The database contains 24 professional actors (12 female, 12 male), vocalizing two lexically-matched statements in a neutral North American accent. Speech includes calm, happy, sad, angry, fearful, surprise, and disgust expressions, and song contains calm, happy, sad, angry, and fearful emotions. Each expression is produced at two levels of emotional intensity (normal, strong), with an additional neutral expression. All conditions are available in three modality formats: Audio-only (16bit, 48kHz .wav), Audio-Video (720p H.264, AAC 48kHz, .mp4), and Video-only (no sound). Note, there are no song files for Actor_18.
Audio-only files
Audio-only files of all actors (01-24) are available as two separate zip files (~200 MB each):
Speech file (Audio_Speech_Actors_01-24.zip, 215 MB) contains 1440 files: 60 trials per actor x 24 actors = 1440.
Song file (Audio_Song_Actors_01-24.zip, 198 MB) contains 1012 files: 44 trials per actor x 23 actors = 1012.
Audio-Visual and Video-only files
Video files are provided as separate zip downloads for each actor (01-24, ~500 MB each), and are split into separate speech and song downloads:
Speech files (Video_Speech_Actor_01.zip to Video_Speech_Actor_24.zip) collectively contains 2880 files: 60 trials per actor x 2 modalities (AV, VO) x 24 actors = 2880.
Song files (Video_Song_Actor_01.zip to Video_Song_Actor_24.zip) collectively contains 2024 files: 44 trials per actor x 2 modalities (AV, VO) x 23 actors = 2024.
File Summary
In total, the RAVDESS collection includes 7356 files (2880+2024+1440+1012 files).
File naming convention
Each of the 7356 RAVDESS files has a unique filename. The filename consists of a 7-part numerical identifier (e.g., 02-01-06-01-02-01-12.mp4). These identifiers define the stimulus characteristics:
Filename identifiers
Modality (01 = full-AV, 02 = video-only, 03 = audio-only).
Vocal channel (01 = speech, 02 = song).
Emotion (01 = neutral, 02 = calm, 03 = happy, 04 = sad, 05 = angry, 06 = fearful, 07 = disgust, 08 = surprised).
Emotional intensity (01 = normal, 02 = strong). NOTE: There is no strong intensity for the 'neutral' emotion.
Statement (01 = "Kids are talking by the door", 02 = "Dogs are sitting by the door").
Repetition (01 = 1st repetition, 02 = 2nd repetition).
Actor (01 to 24. Odd numbered actors are male, even numbered actors are female).
Filename example: 02-01-06-01-02-01-12.mp4
Video-only (02)
Speech (01)
Fearful (06)
Normal intensity (01)
Statement "dogs" (02)
1st Repetition (01)
12th Actor (12)
Female, as the actor ID number is even.
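As an illustration, the 7-part identifier above can be decoded with a short helper. This is a minimal sketch; the function name and dictionary labels are our own, not part of the official distribution:

```python
# Decode a RAVDESS filename into its seven identifier fields.
# Mappings follow the filename convention described above.
MODALITY = {"01": "full-AV", "02": "video-only", "03": "audio-only"}
VOCAL_CHANNEL = {"01": "speech", "02": "song"}
EMOTION = {"01": "neutral", "02": "calm", "03": "happy", "04": "sad",
           "05": "angry", "06": "fearful", "07": "disgust", "08": "surprised"}
INTENSITY = {"01": "normal", "02": "strong"}
STATEMENT = {"01": "Kids are talking by the door",
             "02": "Dogs are sitting by the door"}

def parse_ravdess(filename: str) -> dict:
    """Split a name such as '02-01-06-01-02-01-12.mp4' into labeled fields."""
    stem = filename.rsplit(".", 1)[0]
    modality, channel, emotion, intensity, statement, repetition, actor = stem.split("-")
    return {
        "modality": MODALITY[modality],
        "vocal_channel": VOCAL_CHANNEL[channel],
        "emotion": EMOTION[emotion],
        "intensity": INTENSITY[intensity],
        "statement": STATEMENT[statement],
        "repetition": int(repetition),
        "actor": int(actor),
        # Odd-numbered actors are male, even-numbered actors are female.
        "actor_sex": "male" if int(actor) % 2 == 1 else "female",
    }

print(parse_ravdess("02-01-06-01-02-01-12.mp4")["emotion"])  # fearful
```

Running it on the example filename recovers the same reading as the breakdown above: a video-only, speech, fearful, normal-intensity recording of the "dogs" statement, first repetition, by the 12th (female) actor.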
License information
The RAVDESS is released under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, CC BY-NC-SA 4.0
Commercial licenses for the RAVDESS can also be purchased. For more information, please visit our license fee page, or contact us at ravdess@gmail.com.
Related Data sets
RAVDESS Facial Landmark Tracking data set [Zenodo project page].
Dataset from https://zenodo.org/records/1188976
|
mcorsa/swifterX-4k | ---
license: apache-2.0
---
|
rishthak/albums-mixed | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 65233137.0
num_examples: 500
download_size: 65117160
dataset_size: 65233137.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
traintogpb/aihub-koen-translation-integrated-mini-1m | ---
task_categories:
- translation
language:
- en
- ko
size_categories:
- 1M<n<10M
---
# AI Hub Ko-En Translation Dataset (Integrated)
This dataset merges eight Korean-English translation datasets from AI Hub.
The merged data contains 10,416,509 examples in total, split into train / validation / test at an 8:1:1 ratio.
- base-10m: uses 100% of the merged data, 10,416,509 examples in total
- mini-1m: uses 10% of the merged data (10% randomly sampled within each split of base-10m), 1,041,651 examples in total
- tiny-100k: uses 1% of the merged data (1% randomly sampled within each split of base-10m), 104,165 examples in total
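The 8:1:1 split described above can be sketched in plain Python. This is an illustrative reconstruction; the authors' exact sampling procedure may differ:

```python
import random

def split_8_1_1(n_examples: int, seed: int = 42):
    """Shuffle indices and cut them into 80% / 10% / 10% train/validation/test."""
    indices = list(range(n_examples))
    random.Random(seed).shuffle(indices)
    n_train = int(n_examples * 0.8)
    n_valid = int(n_examples * 0.1)
    train = indices[:n_train]
    valid = indices[n_train:n_train + n_valid]
    test = indices[n_train + n_valid:]
    return train, valid, test

train, valid, test = split_8_1_1(100)
print(len(train), len(valid), len(test))  # 80 10 10
```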
## Subsets
The source datasets are listed below; the number in parentheses next to each dataset name is its datasetkey in aihubshell.
- [전문분야 한영 말뭉치](https://www.aihub.or.kr/aihubdata/data/view.do?currMenu=&topMenu=&aihubDataSe=data&dataSetSn=111) (111)
  - Total count: 1,350,000
  - Count after deduplication: 1,350,000
  - Columns used: '한국어', '영어'
- [한국어-영어 번역 말뭉치(기술과학)](https://www.aihub.or.kr/aihubdata/data/view.do?currMenu=&topMenu=&aihubDataSe=data&dataSetSn=124) (124)
  - Total count: 1,344,631
  - Count after deduplication: 1,344,631
  - Columns used: 'ko', 'en'
- [한국어-영어 번역 말뭉치(사회과학)](https://www.aihub.or.kr/aihubdata/data/view.do?currMenu=&topMenu=&aihubDataSe=data&dataSetSn=125) (125)
  - Total count: 1,361,845
  - Count after deduplication: 1,361,825
  - Columns used: 'ko', 'en'
- [한국어-영어 번역(병렬) 말뭉치](https://www.aihub.or.kr/aihubdata/data/view.do?currMenu=&topMenu=&aihubDataSe=data&dataSetSn=126) (126)
  - Total count: 1,602,418
  - Count after deduplication: 1,599,924
  - Columns used: '원문', '번역문'
- [산업정보 연계 주요국 특허 영-한 데이터](https://www.aihub.or.kr/aihubdata/data/view.do?currMenu=&topMenu=&aihubDataSe=data&dataSetSn=563) (563)
  - Total count: 359,999
  - Count after deduplication: 358,424
  - Columns used: 'astrt_cont_kor', 'astrt_cont_eng'
- [일상생활 및 구어체 한-영 번역 병렬 말뭉치 데이터](https://www.aihub.or.kr/aihubdata/data/view.do?currMenu=&topMenu=&aihubDataSe=data&dataSetSn=71265) (71265)
  - Total count: 2,700,345
  - Count after deduplication: 2,486,058
  - Columns used: 'ko', 'en'
- [기술과학 분야 한-영 번역 병렬 말뭉치 데이터](https://www.aihub.or.kr/aihubdata/data/view.do?currMenu=&topMenu=&aihubDataSe=data&dataSetSn=71266) (71266)
  - Total count: 1,350,162
  - Count after deduplication: 1,328,987
  - Columns used: 'ko', 'en'
- [방송콘텐츠 한국어-영어 번역 말뭉치](https://www.aihub.or.kr/aihubdata/data/view.do?currMenu=&topMenu=&aihubDataSe=data&dataSetSn=71382) (71382)
  - Total count: 587,084
  - Count after deduplication: 586,660
  - Columns used: '원문', '최종번역문'
|
cmagganas/generAd | ---
dataset_info:
features:
- name: name
dtype: string
- name: description
dtype: string
- name: ad
dtype: string
splits:
- name: train
num_bytes: 3173
num_examples: 5
download_size: 7542
dataset_size: 3173
---
# Dataset Card for "generAd"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GreeneryScenery/SheepsScribbleV2 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: image
dtype: image
- name: scribble_image
dtype: image
splits:
- name: train
num_bytes: 8059241532.25
num_examples: 32719
download_size: 8037730689
dataset_size: 8059241532.25
---
# Dataset Card for "SheepsScribbleV2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Thaweewat/thai-med-pack | ---
license: mit
---
|
bcombs/autotrain-data-docid | ---
task_categories:
- text-classification
---
# AutoTrain Dataset for project: docid
## Dataset Description
This dataset has been automatically processed by AutoTrain for project docid.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "MetLife-Walker Information_HI_2023.3.29_14.17_C_B (1).docx.pdf",
"feat_url": "datasaur://static/5732/2a298b78-1c2c-4ff8-ad49-357670dd5ea7.pdf",
"target": 0,
"feat_CarrierName": "Met Life",
"feat_ProductTypes": "Hospital Indemnity"
},
{
"text": "Cima Telecom Inc_Prop (002)_ (2).docx.pdf",
"feat_url": "datasaur://static/5732/8adee066-55c4-4f8d-8dcd-53d5fdb42732.pdf",
"target": 0,
"feat_CarrierName": "Met Life",
"feat_ProductTypes": "Basic Life;Basic AD&D;Voluntary Life;Voluntary AD&D;Voluntary Dependent AD&D;Short-term Disability;Long-term Disability;Dental;Vision"
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"feat_url": "Value(dtype='string', id=None)",
"target": "ClassLabel(names=['Proposal', 'Summary (including SBC)'], id=None)",
"feat_CarrierName": "Value(dtype='string', id=None)",
"feat_ProductTypes": "Value(dtype='string', id=None)"
}
```
### Dataset Splits
This dataset is split into train and validation sets. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 15 |
| valid | 5 |
|
open-llm-leaderboard/details_mistralai__Mistral-7B-v0.1 | ---
pretty_name: Evaluation run of mistralai/Mistral-7B-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 6 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mistralai__Mistral-7B-v0.1\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-02T13:02:14.153054](https://huggingface.co/datasets/open-llm-leaderboard/details_mistralai__Mistral-7B-v0.1/blob/main/results_2023-12-02T13-02-14.153054.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3707354056103108,\n\
\ \"acc_stderr\": 0.013304267705458433\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.3707354056103108,\n \"acc_stderr\": 0.013304267705458433\n\
\ }\n}\n```"
repo_url: https://huggingface.co/mistralai/Mistral-7B-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|arc:challenge|25_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T23_48_21.884715
path:
- '**/details_harness|drop|3_2023-10-25T23-48-21.884715.parquet'
- split: 2023_10_26T01_29_53.089924
path:
- '**/details_harness|drop|3_2023-10-26T01-29-53.089924.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-26T01-29-53.089924.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T23_48_21.884715
path:
- '**/details_harness|gsm8k|5_2023-10-25T23-48-21.884715.parquet'
- split: 2023_10_26T01_29_53.089924
path:
- '**/details_harness|gsm8k|5_2023-10-26T01-29-53.089924.parquet'
- split: 2023_12_01T11_13_53.246042
path:
- '**/details_harness|gsm8k|5_2023-12-01T11-13-53.246042.parquet'
- split: 2023_12_02T13_01_55.687268
path:
- '**/details_harness|gsm8k|5_2023-12-02T13-01-55.687268.parquet'
- split: 2023_12_02T13_02_14.153054
path:
- '**/details_harness|gsm8k|5_2023-12-02T13-02-14.153054.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-02T13-02-14.153054.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hellaswag|10_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-27T15-30-59.039834.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-27T15-30-59.039834.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-27T15-30-59.039834.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T23_48_21.884715
path:
- '**/details_harness|winogrande|5_2023-10-25T23-48-21.884715.parquet'
- split: 2023_10_26T01_29_53.089924
path:
- '**/details_harness|winogrande|5_2023-10-26T01-29-53.089924.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-26T01-29-53.089924.parquet'
- config_name: results
data_files:
- split: 2023_09_27T15_30_59.039834
path:
- results_2023-09-27T15-30-59.039834.parquet
- split: 2023_10_25T23_48_21.884715
path:
- results_2023-10-25T23-48-21.884715.parquet
- split: 2023_10_26T01_29_53.089924
path:
- results_2023-10-26T01-29-53.089924.parquet
- split: 2023_12_01T11_13_53.246042
path:
- results_2023-12-01T11-13-53.246042.parquet
- split: 2023_12_02T13_01_55.687268
path:
- results_2023-12-02T13-01-55.687268.parquet
- split: 2023_12_02T13_02_14.153054
path:
- results_2023-12-02T13-02-14.153054.parquet
- split: latest
path:
- results_2023-12-02T13-02-14.153054.parquet
---
# Dataset Card for Evaluation run of mistralai/Mistral-7B-v0.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/mistralai/Mistral-7B-v0.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 6 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mistralai__Mistral-7B-v0.1",
"harness_gsm8k_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-02T13:02:14.153054](https://huggingface.co/datasets/open-llm-leaderboard/details_mistralai__Mistral-7B-v0.1/blob/main/results_2023-12-02T13-02-14.153054.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3707354056103108,
"acc_stderr": 0.013304267705458433
},
"harness|gsm8k|5": {
"acc": 0.3707354056103108,
"acc_stderr": 0.013304267705458433
}
}
```
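As an alternative to `load_dataset`, the raw results JSON for a run can be fetched directly with `huggingface_hub` (a sketch; the filename is taken from the `results` split listing in the frontmatter above, and the top-level keys of the file are inspected rather than assumed):

```python
import json
from huggingface_hub import hf_hub_download

# Download the aggregated results JSON for the latest run of this model
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_mistralai__Mistral-7B-v0.1",
    filename="results_2023-12-02T13-02-14.153054.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

# Inspect the structure before drilling into specific metrics
print(list(results))
```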
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4 | ---
pretty_name: Evaluation run of jondurbin/airoboros-13b-gpt4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airoboros-13b-gpt4](https://huggingface.co/jondurbin/airoboros-13b-gpt4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-23T04:00:44.911684](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4/blob/main/results_2023-10-23T04-00-44.911684.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.014681208053691275,\n\
\ \"em_stderr\": 0.001231711314310859,\n \"f1\": 0.07406564597315451,\n\
\ \"f1_stderr\": 0.0017844772735649754,\n \"acc\": 0.4182714775789221,\n\
\ \"acc_stderr\": 0.009732871523024014\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.014681208053691275,\n \"em_stderr\": 0.001231711314310859,\n\
\ \"f1\": 0.07406564597315451,\n \"f1_stderr\": 0.0017844772735649754\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07884761182714177,\n \
\ \"acc_stderr\": 0.00742339051987324\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7576953433307024,\n \"acc_stderr\": 0.012042352526174787\n\
\ }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airoboros-13b-gpt4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|arc:challenge|25_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_23T04_00_44.911684
path:
- '**/details_harness|drop|3_2023-10-23T04-00-44.911684.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-23T04-00-44.911684.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_23T04_00_44.911684
path:
- '**/details_harness|gsm8k|5_2023-10-23T04-00-44.911684.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-23T04-00-44.911684.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hellaswag|10_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:07:58.585031.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T14:07:58.585031.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T14:07:58.585031.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_23T04_00_44.911684
path:
- '**/details_harness|winogrande|5_2023-10-23T04-00-44.911684.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-23T04-00-44.911684.parquet'
- config_name: results
data_files:
- split: 2023_08_18T14_07_58.585031
path:
- results_2023-08-18T14:07:58.585031.parquet
- split: 2023_10_23T04_00_44.911684
path:
- results_2023-10-23T04-00-44.911684.parquet
- split: latest
path:
- results_2023-10-23T04-00-44.911684.parquet
---
# Dataset Card for Evaluation run of garage-bAInd/Camel-Platypus2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/garage-bAInd/Camel-Platypus2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [garage-bAInd/Camel-Platypus2-13B](https://huggingface.co/garage-bAInd/Camel-Platypus2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-13B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-23T04:00:44.911684](https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-13B/blob/main/results_2023-10-23T04-00-44.911684.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.014681208053691275,
"em_stderr": 0.001231711314310859,
"f1": 0.07406564597315451,
"f1_stderr": 0.0017844772735649754,
"acc": 0.4182714775789221,
"acc_stderr": 0.009732871523024014
},
"harness|drop|3": {
"em": 0.014681208053691275,
"em_stderr": 0.001231711314310859,
"f1": 0.07406564597315451,
"f1_stderr": 0.0017844772735649754
},
"harness|gsm8k|5": {
"acc": 0.07884761182714177,
"acc_stderr": 0.00742339051987324
},
"harness|winogrande|5": {
"acc": 0.7576953433307024,
"acc_stderr": 0.012042352526174787
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
kanishka/counterfactual_babylm_aann_indef_articles_with_pl_nouns_removal | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 581810331
num_examples: 11662188
- name: validation
num_bytes: 56120230
num_examples: 1026747
download_size: 421777159
dataset_size: 637930561
---
# Dataset Card for "counterfactual_babylm_aann_indef_articles_with_pl_nouns_removal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wecover/OPUS_News-Commentary | ---
configs:
- config_name: default
data_files:
- split: train
path: '*/*/train.parquet'
- split: valid
path: '*/*/valid.parquet'
- split: test
path: '*/*/test.parquet'
- config_name: ar
data_files:
- split: train
path: '*/*ar*/train.parquet'
- split: test
path: '*/*ar*/test.parquet'
- split: valid
path: '*/*ar*/valid.parquet'
- config_name: cs
data_files:
- split: train
path: '*/*cs*/train.parquet'
- split: test
path: '*/*cs*/test.parquet'
- split: valid
path: '*/*cs*/valid.parquet'
- config_name: de
data_files:
- split: train
path: '*/*de*/train.parquet'
- split: test
path: '*/*de*/test.parquet'
- split: valid
path: '*/*de*/valid.parquet'
- config_name: en
data_files:
- split: train
path: '*/*en*/train.parquet'
- split: test
path: '*/*en*/test.parquet'
- split: valid
path: '*/*en*/valid.parquet'
- config_name: es
data_files:
- split: train
path: '*/*es*/train.parquet'
- split: test
path: '*/*es*/test.parquet'
- split: valid
path: '*/*es*/valid.parquet'
- config_name: fr
data_files:
- split: train
path: '*/*fr*/train.parquet'
- split: test
path: '*/*fr*/test.parquet'
- split: valid
path: '*/*fr*/valid.parquet'
- config_name: it
data_files:
- split: train
path: '*/*it*/train.parquet'
- split: test
path: '*/*it*/test.parquet'
- split: valid
path: '*/*it*/valid.parquet'
- config_name: ja
data_files:
- split: train
path: '*/*ja*/train.parquet'
- split: test
path: '*/*ja*/test.parquet'
- split: valid
path: '*/*ja*/valid.parquet'
- config_name: nl
data_files:
- split: train
path: '*/*nl*/train.parquet'
- split: test
path: '*/*nl*/test.parquet'
- split: valid
path: '*/*nl*/valid.parquet'
- config_name: pt
data_files:
- split: train
path: '*/*pt*/train.parquet'
- split: test
path: '*/*pt*/test.parquet'
- split: valid
path: '*/*pt*/valid.parquet'
- config_name: ru
data_files:
- split: train
path: '*/*ru*/train.parquet'
- split: test
path: '*/*ru*/test.parquet'
- split: valid
path: '*/*ru*/valid.parquet'
- config_name: hi
data_files:
- split: train
path: '*/*hi*/train.parquet'
- split: test
path: '*/*hi*/test.parquet'
- split: valid
path: '*/*hi*/valid.parquet'
- config_name: id
data_files:
- split: train
path: '*/*id*/train.parquet'
- split: test
path: '*/*id*/test.parquet'
- split: valid
path: '*/*id*/valid.parquet'
- config_name: kk
data_files:
- split: train
path: '*/*kk*/train.parquet'
- split: test
path: '*/*kk*/test.parquet'
- split: valid
path: '*/*kk*/valid.parquet'
---
|
distilled-from-one-sec-cv12/chunk_64 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1189074136
num_examples: 231698
download_size: 1207674800
dataset_size: 1189074136
---
# Dataset Card for "chunk_64"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shrikant11/myra6 | ---
dataset_info:
features:
- name: image
dtype: image
- name: agnostic-mask
dtype: image
- name: agnostic-v3.2
dtype: image
- name: cloth
dtype: image
- name: cloth-mask
dtype: image
- name: image-densepose
dtype: image
- name: image-parse-agnostic
dtype: image
- name: image-parse
dtype: image
- name: openpose-image
dtype: image
- name: openpose-json
dtype: string
splits:
- name: train
num_bytes: 4496453609.531
num_examples: 11647
download_size: 4161799126
dataset_size: 4496453609.531
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pinzhenchen/alpaca-cleaned-cs | ---
license: cc-by-nc-4.0
task_categories:
- text-generation
- question-answering
language:
- cs
tags:
- instruction tuning
size_categories:
- 10K<n<100K
---
### Data Description
This HF data repository contains the Czech Alpaca dataset used in our study of monolingual versus multilingual instruction tuning.
* [GitHub](https://github.com/hplt-project/monolingual-multilingual-instruction-tuning/tree/main)
* [Paper](https://arxiv.org/abs/2309.08958)
#### Creation
* Machine-translated from [yahma/alpaca-cleaned](https://huggingface.co/datasets/yahma/alpaca-cleaned) into Czech.
#### Usage
* This data is intended to be used for Czech instruction tuning.
* The dataset has roughly 52K instances in the JSON format.
* Each instance has an instruction, an output, and an optional input. An example is shown below:
```
{
"instruction": "Jaké jsou tři základní barvy?",
"input": "",
"output": "Tři základní barvy jsou červená, modrá a žlutá. Tyto barvy se nazývají primárními, protože nemohou být vytvořeny smícháním jiných barev a všechny ostatní barvy mohou být vytvořeny jejich kombinací v různých proporcích. V aditivním barevném systému, který se používá pro světlo, jsou základní barvy červená, zelená a modrá (RGB)."
}
```
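For instruction tuning, each instance is typically rendered into a single prompt string. A minimal sketch of such a formatter (the Alpaca-style template below is a common convention, not a format mandated by this dataset):

```python
def format_instance(instance: dict) -> str:
    """Render an instruction/input/output instance into one training prompt.

    The section headers below follow the common Alpaca convention; this is
    an illustrative assumption, not a format prescribed by this dataset.
    """
    instruction = instance["instruction"]
    inp = instance.get("input", "")
    output = instance["output"]
    if inp:
        return (
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{inp}\n\n"
            f"### Response:\n{output}"
        )
    # Instances with an empty "input" simply omit the Input section.
    return f"### Instruction:\n{instruction}\n\n### Response:\n{output}"


example = {
    "instruction": "Jaké jsou tři základní barvy?",
    "input": "",
    "output": "Tři základní barvy jsou červená, modrá a žlutá.",
}
print(format_instance(example))
```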
#### Known issues
* The machine translation process might have corrupted data containing code, cross-lingual tasks, grammatical error correction tasks, etc.
#### Citation
```
@inproceedings{chen-etal-2024-monolingual,
title="Monolingual or multilingual instruction tuning: Which makes a better {Alpaca}",
author="Pinzhen Chen and Shaoxiong Ji and Nikolay Bogoychev and Andrey Kutuzov and Barry Haddow and Kenneth Heafield",
year="2024",
booktitle = "Findings of the Association for Computational Linguistics: EACL 2024",
}
``` |
ZhangShenao/0.001_idpo_declr_4iters_dataset | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: score_chosen
dtype: float64
- name: score_rejected
dtype: float64
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: is_better
dtype: bool
splits:
- name: test_prefs_1
num_bytes: 14055658
num_examples: 2000
- name: train_prefs_1
num_bytes: 108034015
num_examples: 15283
- name: test_prefs_2
num_bytes: 13990519
num_examples: 2000
- name: train_prefs_2
num_bytes: 107443321
num_examples: 15283
- name: test_prefs_3
num_bytes: 14191132
num_examples: 2000
- name: train_prefs_3
num_bytes: 109012222
num_examples: 15283
download_size: 203652253
dataset_size: 366726867
configs:
- config_name: default
data_files:
- split: test_prefs_1
path: data/test_prefs_1-*
- split: train_prefs_1
path: data/train_prefs_1-*
- split: test_prefs_2
path: data/test_prefs_2-*
- split: train_prefs_2
path: data/train_prefs_2-*
- split: test_prefs_3
path: data/test_prefs_3-*
- split: train_prefs_3
path: data/train_prefs_3-*
---
# Dataset Card for "0.001_idpo_declr_4iters_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JangLinhe/lrs3 | ---
license: apache-2.0
---
|
AdapterOcean/med_alpaca_standardized_cluster_29 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 141690732
num_examples: 14936
download_size: 40375435
dataset_size: 141690732
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_29"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jclian91/open_domain_triple_extraction | ---
license: mit
---
Open-domain triple extraction data (mainly person relations, job titles, etc.), manually collected and curated by the author.
Training data (train): spo.json, 5,259 examples in total.
Test data (test): evaluate_data.xlsx, 100 examples in total. |
openaccess-ai-collective/b0acea6ce295e0a9b16250cfc903cf0c | Invalid username or password. |
hemachandher/your_dataset_name | ---
dataset_info:
features:
- name: question_latest
dtype: string
- name: latest_info
dtype: string
splits:
- name: train
num_bytes: 346
num_examples: 1
download_size: 3545
dataset_size: 346
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
NeuML/wikipedia-20240101 | ---
annotations_creators:
- no-annotation
language:
- en
language_creators:
- found
license:
- cc-by-sa-3.0
- gfdl
multilinguality:
- monolingual
pretty_name: Wikipedia English January 2024
size_categories:
- 1M<n<10M
source_datasets: []
tags:
- pretraining
- language modelling
- wikipedia
- web
task_categories: []
task_ids: []
---
# Dataset Card for Wikipedia English January 2024
Dataset created using this [repo](https://huggingface.co/datasets/NeuML/wikipedia) with a January 2024 Wikipedia snapshot.
This repo also includes a precomputed pageviews database, which stores the aggregated number of views for each page in Wikipedia. It is built from the Wikipedia [Pageview complete dumps](https://dumps.wikimedia.org/other/pageview_complete/readme.html).
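As a sketch of how such a pageviews database could be queried with Python's `sqlite3` module (the table and column names below are illustrative assumptions; inspect the actual file's schema before relying on them):

```python
import sqlite3

# Build a tiny in-memory stand-in for the pageviews database.
# Assumed schema: one row per page with its aggregated view count;
# the real database shipped with this repo may use different names.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE pages (title TEXT PRIMARY KEY, views INTEGER)")
con.executemany(
    "INSERT INTO pages VALUES (?, ?)",
    [
        ("Python (programming language)", 1500),
        ("Anarchism", 900),
        ("Obscure stub", 3),
    ],
)

# Rank pages by popularity, e.g. to prioritize high-traffic articles.
top = con.execute(
    "SELECT title, views FROM pages ORDER BY views DESC LIMIT 2"
).fetchall()
print(top)
```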
|
pseeej/animal-crossing-data | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 7209776.0
num_examples: 389
download_size: 7181848
dataset_size: 7209776.0
---
# Dataset Card for "animal-crossing-data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
UrbanJoe/LlamaMaster | ---
license: cc0-1.0
---
|
saibo/bookcorpus_deduplicated_small | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7321888
num_examples: 100000
download_size: 4495653
dataset_size: 7321888
---
# Dataset Card for "bookcorpus_deduplicated_small"
First 100K (0.25%) examples of [bookcorpus_deduplicated](https://huggingface.co/datasets/saibo/bookcorpus_deduplicated)
size: 7.4MB
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ai2lumos/lumos_maths_plan_onetime | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
tags:
- language-agent
- maths
- reasoning
size_categories:
- 10K<n<100K
---
# 🪄 Agent Lumos: Unified and Modular Training for Open-Source Language Agents
<p align="center">
🌐<a href="https://allenai.github.io/lumos">[Website]</a>
📝<a href="https://arxiv.org/abs/2311.05657">[Paper]</a>
🤗<a href="https://huggingface.co/datasets?sort=trending&search=ai2lumos">[Data]</a>
🤗<a href="https://huggingface.co/models?sort=trending&search=ai2lumos">[Model]</a>
🤗<a href="https://huggingface.co/spaces/ai2lumos/lumos_data_demo">[Demo]</a>
</p>
We introduce 🪄**Lumos**, Language Agents with **Unified** Formats, **Modular** Design, and **Open-Source** LLMs. **Lumos** unifies a suite of complex interactive tasks and achieves competitive performance with GPT-4/3.5-based and larger open-source agents.
**Lumos** has the following features:
* 🧩 **Modular Architecture**:
- 🧩 **Lumos** consists of planning, grounding, and execution modules built based on LLAMA-2-7B/13B and off-the-shelf APIs.
- 🤗 **Lumos** utilizes a unified data format that encompasses multiple task types, thereby enabling the developed agent framework to conveniently support a range of interactive tasks.
* 🌍 **Diverse Training Data**:
- 🌍 **Lumos** is trained with ~56K diverse high-quality subgoal/action annotations from ground-truth reasoning steps in existing benchmarks with GPT-4.
- ⚒️ **Lumos** data can be instrumental for future research in developing open-source agents for complex interactive tasks.
* 🚀 **Competitive Performance**:
- 🚀 **Lumos** is comparable or even beats **GPT-series** agents on web/complex QA tasks Mind2Web and HotpotQA, and **larger open agents** on math and multimodal tasks.
- 🚀 **Lumos** exceeds contemporaneous agents that have been **fine-tuned** with in-domain HotpotQA, Mind2Web and ScienceQA annotations, such as **FiReAct**, **AgentLM**, and **AutoAct**.
- 🚀 **Lumos** performs better than open agent baseline formulations including **chain-of-thoughts** and **integrated** training.
- 🚀 **Lumos** surpasses larger open LLM agents and domain-specific agents on unseen tasks, WebShop and InterCode_SQL.
## Data Overview
`lumos_maths_plan_onetime` is the data for training the **planning** module on the **maths** task in the **Lumos-Onetime (Lumos-O)** formulation.
The source of the training annotation data is shown below:
| Task | Number |
|---|---|
|PRM800K|10000|
|GSM8K|7473|
|ASDiv|2305|
## Models Trained with the Data
`lumos_maths_plan_onetime` is used to train the following models.
|Model|Huggingface Repo|
|---|---|
|`lumos_maths_plan_onetime`| [🤗Huggingface Repo](https://huggingface.co/ai2lumos/lumos_maths_plan_onetime) |
|`lumos_maths_plan_onetime-13B`| [🤗Huggingface Repo](https://huggingface.co/ai2lumos/lumos_maths_plan_onetime-13B) |
## Citation
If you find this work relevant to your research, please feel free to cite it!
```
@article{yin2023lumos,
title={Agent Lumos: Unified and Modular Training for Open-Source Language Agents},
author={Yin, Da and Brahman, Faeze and Ravichander, Abhilasha and Chandu, Khyathi and Chang, Kai-Wei and Choi, Yejin and Lin, Bill Yuchen},
journal={arXiv preprint arXiv:2311.05657},
year={2023}
}
``` |
AbderrahmanSkiredj1/ahadith_translation_34k | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 56339232
num_examples: 34088
download_size: 19500537
dataset_size: 56339232
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
joey234/mmlu-anatomy-rule-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 69096
num_examples: 135
download_size: 38667
dataset_size: 69096
---
# Dataset Card for "mmlu-anatomy-rule-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
s-nlp/ru_paradetox | ---
license: openrail++
task_categories:
- text-generation
language:
- ru
---
# ParaDetox: Detoxification with Parallel Data (Russian)
This repository contains information about the Russian ParaDetox dataset -- the first parallel corpus for the detoxification task -- as well as models for the detoxification of Russian texts.
## ParaDetox Collection Pipeline
The ParaDetox dataset collection was done via the [Yandex.Toloka](https://toloka.yandex.com/) crowdsourcing platform. The collection was done in three steps:
* *Task 1:* **Generation of Paraphrases**: The first crowdsourcing task asks users to eliminate toxicity in a given sentence while keeping the content.
* *Task 2:* **Content Preservation Check**: We show users the generated paraphrases along with their original variants and ask them to indicate if they have close meanings.
* *Task 3:* **Toxicity Check**: Finally, we check if the workers succeeded in removing toxicity.
All these steps were done to ensure high quality of the data and make the process of collection automated. For more details please refer to the original paper.
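The three tasks above amount to an acceptance filter over candidate paraphrases. A schematic sketch of that logic (majority voting with a 0.5 threshold is an illustrative choice here; the paper describes the exact aggregation used):

```python
def accept_paraphrase(content_votes, nontoxic_votes, threshold=0.5):
    """Accept a paraphrase only if it passes both crowdsourced checks.

    content_votes: booleans from Task 2 (does the paraphrase keep the meaning?)
    nontoxic_votes: booleans from Task 3 (was the toxicity removed?)
    The majority-vote aggregation below is an assumption for illustration.
    """
    content_ok = sum(content_votes) / len(content_votes) >= threshold
    nontoxic_ok = sum(nontoxic_votes) / len(nontoxic_votes) >= threshold
    return content_ok and nontoxic_ok


# A paraphrase most workers judged meaning-preserving and non-toxic passes.
print(accept_paraphrase([True, True, False], [True, True, True]))
```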
## Detoxification model
**New SOTA** for detoxification task -- ruT5 (base) model trained on Russian ParaDetox dataset -- we released online in HuggingFace🤗 repository [here](https://huggingface.co/s-nlp/ruT5-base-detox).
You can also check out our [demo](https://detoxifier.nlp.zhores.net/junction/) and telegram [bot](https://t.me/rudetoxifierbot).
## Citation
```
@article{dementievarusse,
title={RUSSE-2022: Findings of the First Russian Detoxification Shared Task Based on Parallel Corpora},
author={Dementieva, Daryna and Logacheva, Varvara and Nikishina, Irina and Fenogenova, Alena and Dale, David and Krotova, Irina and Semenov, Nikita and Shavrina, Tatiana and Panchenko, Alexander}
}
```
## Contacts
If you find any issues, do not hesitate to report them on [GitHub Issues](https://github.com/s-nlp/russe_detox_2022).
For any questions, please contact: Daryna Dementieva (dardem96@gmail.com) |
freshpearYoon/train_free_3 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 9605030200
num_examples: 10000
download_size: 1585017803
dataset_size: 9605030200
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/VALUE_sst2_got | ---
dataset_info:
features:
- name: idx
dtype: int64
- name: sentence
dtype: string
- name: label
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 4558
num_examples: 30
- name: test
num_bytes: 8206
num_examples: 56
- name: train
num_bytes: 134987
num_examples: 1160
download_size: 76260
dataset_size: 147751
---
# Dataset Card for "VALUE_sst2_got"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-1.4.1 | ---
pretty_name: Evaluation run of jondurbin/airoboros-l2-13b-gpt4-1.4.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airoboros-l2-13b-gpt4-1.4.1](https://huggingface.co/jondurbin/airoboros-l2-13b-gpt4-1.4.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-1.4.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T19:03:13.374959](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-1.4.1/blob/main/results_2023-10-22T19-03-13.374959.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0026216442953020135,\n\
\ \"em_stderr\": 0.0005236685642965807,\n \"f1\": 0.07133494127516771,\n\
\ \"f1_stderr\": 0.0015039896976380969,\n \"acc\": 0.40148895416572666,\n\
\ \"acc_stderr\": 0.00972321783657909\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0026216442953020135,\n \"em_stderr\": 0.0005236685642965807,\n\
\ \"f1\": 0.07133494127516771,\n \"f1_stderr\": 0.0015039896976380969\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06974981046247157,\n \
\ \"acc_stderr\": 0.00701638957101385\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7332280978689818,\n \"acc_stderr\": 0.012430046102144331\n\
\ }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airoboros-l2-13b-gpt4-1.4.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|arc:challenge|25_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T19_03_13.374959
path:
- '**/details_harness|drop|3_2023-10-22T19-03-13.374959.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T19-03-13.374959.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T19_03_13.374959
path:
- '**/details_harness|gsm8k|5_2023-10-22T19-03-13.374959.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T19-03-13.374959.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hellaswag|10_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T15:17:43.655120.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T15:17:43.655120.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T15:17:43.655120.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T19_03_13.374959
path:
- '**/details_harness|winogrande|5_2023-10-22T19-03-13.374959.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T19-03-13.374959.parquet'
- config_name: results
data_files:
- split: 2023_07_24T15_17_43.655120
path:
- results_2023-07-24T15:17:43.655120.parquet
- split: 2023_10_22T19_03_13.374959
path:
- results_2023-10-22T19-03-13.374959.parquet
- split: latest
path:
- results_2023-10-22T19-03-13.374959.parquet
---
# Dataset Card for Evaluation run of jondurbin/airoboros-l2-13b-gpt4-1.4.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-l2-13b-gpt4-1.4.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-l2-13b-gpt4-1.4.1](https://huggingface.co/jondurbin/airoboros-l2-13b-gpt4-1.4.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-1.4.1",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-22T19:03:13.374959](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-1.4.1/blob/main/results_2023-10-22T19-03-13.374959.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you will find each in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0026216442953020135,
"em_stderr": 0.0005236685642965807,
"f1": 0.07133494127516771,
"f1_stderr": 0.0015039896976380969,
"acc": 0.40148895416572666,
"acc_stderr": 0.00972321783657909
},
"harness|drop|3": {
"em": 0.0026216442953020135,
"em_stderr": 0.0005236685642965807,
"f1": 0.07133494127516771,
"f1_stderr": 0.0015039896976380969
},
"harness|gsm8k|5": {
"acc": 0.06974981046247157,
"acc_stderr": 0.00701638957101385
},
"harness|winogrande|5": {
"acc": 0.7332280978689818,
"acc_stderr": 0.012430046102144331
}
}
```
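Since the "results" config stores one nested dict per run, the per-task accuracies above can be ranked with a few lines of plain Python. The `rank_by_acc` helper below is illustrative, not part of the leaderboard tooling; the values are copied from the JSON above:

```python
# Metrics copied from the latest-results JSON above (only the tasks reporting 'acc').
latest = {
    "harness|gsm8k|5": {"acc": 0.06974981046247157, "acc_stderr": 0.00701638957101385},
    "harness|winogrande|5": {"acc": 0.7332280978689818, "acc_stderr": 0.012430046102144331},
}

def rank_by_acc(results):
    """Return (task, acc) pairs sorted best-first, skipping tasks without 'acc'."""
    pairs = [(task, m["acc"]) for task, m in results.items() if "acc" in m]
    return sorted(pairs, key=lambda p: p[1], reverse=True)

for task, acc in rank_by_acc(latest):
    print(f"{task}: {acc:.4f}")  # winogrande ranks first at 0.7332
```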
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
jxu124/visdial | ---
license: cc-by-4.0
dataset_info:
features:
- name: caption
dtype: string
- name: dialog
sequence:
sequence: string
- name: image_path
dtype: string
- name: global_image_id
dtype: string
- name: anns_id
dtype: string
splits:
- name: train
num_bytes: 77657548
num_examples: 123287
- name: test
num_bytes: 3495490
num_examples: 8000
- name: validation
num_bytes: 1408883
num_examples: 2064
download_size: 34814702
dataset_size: 82561921
---
Usage:
```python
from dataclasses import dataclass
import datasets
# load and path setting
ds_visdial = datasets.load_dataset('jxu124/visdial')
path_map = {
    "coco/train2014": "/datasets/coco/train2014",
    "coco/val2014": "/datasets/coco/val2014",
    "visdial/VisualDialog_test2018": "/datasets/visdial/VisualDialog_test2018",
    "visdial/VisualDialog_val2018": "/datasets/visdial/VisualDialog_val2018"
}
# apply to your datasets
@dataclass
class ReplaceImagePath():
    path_map: dict
def __call__(self, features):
for k, v in self.path_map.items():
features['image'] = features['image'].replace(k, v)
return features
ds_visdial = ds_visdial.map(ReplaceImagePath(path_map=path_map)).cast_column("image", datasets.Image())
``` |
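To sanity-check the path mapping before downloading anything, the same callable can be exercised on a fake record. This is a standalone sketch of the helper above (the filename is made up for illustration):

```python
from dataclasses import dataclass, field

# Same shape as the ReplaceImagePath helper above, with a dict annotation and default.
@dataclass
class ReplaceImagePath:
    path_map: dict = field(default_factory=dict)

    def __call__(self, features):
        # str.replace rewrites the relative prefix into an absolute local path.
        for k, v in self.path_map.items():
            features['image'] = features['image'].replace(k, v)
        return features

# Dry run on a single fake record -- no dataset download needed.
fn = ReplaceImagePath(path_map={"coco/train2014": "/datasets/coco/train2014"})
record = {"image": "coco/train2014/COCO_train2014_000000123287.jpg"}
print(fn(record)["image"])  # -> /datasets/coco/train2014/COCO_train2014_000000123287.jpg
```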
markmp/marketing_email_test | ---
dataset_info:
features:
- name: product
dtype: string
- name: description
dtype: string
- name: marketing_email
dtype: string
splits:
- name: train
num_bytes: 13830
num_examples: 10
download_size: 18502
dataset_size: 13830
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "marketing_email_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nekochu/novel17_train_alpaca_format | ---
license: apache-2.0
---
Credit: AlexanderDoria/novel17_test |
CyberHarem/vanessa_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Vanessa (Fire Emblem)
This is the dataset of Vanessa (Fire Emblem), containing 40 images and their tags.
The core tags of this character are `green_hair, green_eyes, long_hair, braid`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 40 | 26.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vanessa_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 40 | 21.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vanessa_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 58 | 34.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vanessa_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 40 | 26.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vanessa_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 58 | 41.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/vanessa_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/vanessa_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 18 |  |  |  |  |  | 1girl, elbow_gloves, solo, thighhighs, breastplate, spear, fingerless_gloves, belt, dress, shoulder_armor, white_gloves, holding_weapon, open_mouth, zettai_ryouiki, pegasus_knight_uniform_(fire_emblem), thigh_boots |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | elbow_gloves | solo | thighhighs | breastplate | spear | fingerless_gloves | belt | dress | shoulder_armor | white_gloves | holding_weapon | open_mouth | zettai_ryouiki | pegasus_knight_uniform_(fire_emblem) | thigh_boots |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-------|:-------------|:--------------|:--------|:--------------------|:-------|:--------|:-----------------|:---------------|:-----------------|:-------------|:-----------------|:---------------------------------------|:--------------|
| 0 | 18 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
hlt-lab/dailydialogsample-negate_previous_utterance | ---
dataset_info:
features:
- name: context
dtype: string
- name: response
dtype: string
- name: reference
dtype: string
splits:
- name: train
num_bytes: 45417
num_examples: 100
download_size: 35523
dataset_size: 45417
---
# Dataset Card for "dailydialogsample-negate_previous_utterance"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/zooey_granbluefantasy | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Zooey (Granblue Fantasy)
This is the dataset of Zooey (Granblue Fantasy), containing 500 images and their tags.
The core tags of this character are `dark_skin, long_hair, dark-skinned_female, white_hair, red_eyes, hair_between_eyes, ahoge, breasts, very_long_hair, medium_breasts, hair_ornament, hair_flower, bangs`; these tags are pruned from the per-image tag lists in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 670.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zooey_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 404.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zooey_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1196 | 844.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zooey_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 603.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zooey_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1196 | 1.12 GiB | [Download](https://huggingface.co/datasets/CyberHarem/zooey_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/zooey_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 24 |  |  |  |  |  | 1girl, collarbone, official_alternate_costume, solo, white_bikini, bare_shoulders, cleavage, looking_at_viewer, front-tie_bikini_top, hibiscus, blush, navel, simple_background, open_mouth, white_background, upper_body, :d, dragon |
| 1 | 14 |  |  |  |  |  | 1girl, armored_dress, blue_dress, solo, bare_shoulders, looking_at_viewer, breastplate, smile, sword, thighhighs, black_gloves, simple_background, white_background, blush, dragon, open_mouth, shield |
| 2 | 21 |  |  |  |  |  | 1girl, armored_dress, solo, breastplate, holding_sword, bare_shoulders, blue_dress, thighhighs, looking_at_viewer, shield, boots, black_gloves, simple_background, dragon, short_dress, white_background, full_body |
| 3 | 9 |  |  |  |  |  | 1boy, 1girl, blush, female_pubic_hair, hetero, nipples, penis, pussy, sex, solo_focus, vaginal, large_breasts, open_mouth, spread_legs, sweat, bar_censor, navel, clitoris, completely_nude, smile, looking_at_viewer, lying |
| 4 | 7 |  |  |  |  |  | 1girl, ass, looking_at_viewer, solo, anus, blush, nipples, smile, spread_legs, bar_censor, open_mouth, sweat, completely_nude, light_areolae, mosaic_censoring, pussy_juice, shiny_skin, spread_pussy |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | collarbone | official_alternate_costume | solo | white_bikini | bare_shoulders | cleavage | looking_at_viewer | front-tie_bikini_top | hibiscus | blush | navel | simple_background | open_mouth | white_background | upper_body | :d | dragon | armored_dress | blue_dress | breastplate | smile | sword | thighhighs | black_gloves | shield | holding_sword | boots | short_dress | full_body | 1boy | female_pubic_hair | hetero | nipples | penis | pussy | sex | solo_focus | vaginal | large_breasts | spread_legs | sweat | bar_censor | clitoris | completely_nude | lying | ass | anus | light_areolae | mosaic_censoring | pussy_juice | shiny_skin | spread_pussy |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:-----------------------------|:-------|:---------------|:-----------------|:-----------|:--------------------|:-----------------------|:-----------|:--------|:--------|:--------------------|:-------------|:-------------------|:-------------|:-----|:---------|:----------------|:-------------|:--------------|:--------|:--------|:-------------|:---------------|:---------|:----------------|:--------|:--------------|:------------|:-------|:--------------------|:---------|:----------|:--------|:--------|:------|:-------------|:----------|:----------------|:--------------|:--------|:-------------|:-----------|:------------------|:--------|:------|:-------|:----------------|:-------------------|:--------------|:-------------|:---------------|
| 0 | 24 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | | | X | | X | | X | | | X | | X | X | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 21 |  |  |  |  |  | X | | | X | | X | | X | | | | | X | | X | | | X | X | X | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 9 |  |  |  |  |  | X | | | | | | | X | | | X | X | | X | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | | X | | | | X | | | X | | | X | | | | | | | | X | | | | | | | | | | | | X | | | | | | | X | X | X | | X | | X | X | X | X | X | X | X |
|
stjarvie/question_to_sql_with_ddl | ---
dataset_info:
features:
- name: question
dtype: string
- name: sql
dtype: string
- name: schema
dtype: string
splits:
- name: train
num_bytes: 1856
num_examples: 10
- name: test
num_bytes: 2005
num_examples: 10
download_size: 6616
dataset_size: 3861
---
# Dataset Card for "question_to_sql_with_ddl"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
McAuley-Lab/Amazon-Reviews-2023 | ---
language:
- en
tags:
- recommendation
- reviews
size_categories:
- 10B<n<100B
---
# Amazon Reviews 2023
**Please also visit [amazon-reviews-2023.github.io/](https://amazon-reviews-2023.github.io/) for more details, loading scripts, and preprocessed benchmark files.**
**[April 7, 2024]** We have added two useful files:
1. `all_categories.txt`: 34 lines (33 categories + "Unknown"), each line contains a category name.
2. `asin2category.json`: A mapping between `parent_asin` (item ID) to its corresponding category name.
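A sketch of how the `asin2category.json` mapping can be used once downloaded; it is a plain JSON object from `parent_asin` to category name. The tiny inline mapping below stands in for the real file, and the `"Unknown"` fallback follows the category list described above.

```python
import json

def category_of(asin2category, parent_asin):
    """Look up the category for a parent_asin, defaulting to 'Unknown'."""
    return asin2category.get(parent_asin, "Unknown")

# Illustrative stand-in; load the real file with json.load(open("asin2category.json"))
asin2category = json.loads('{"B01CUPMQZE": "All_Beauty"}')
print(category_of(asin2category, "B01CUPMQZE"))  # All_Beauty
print(category_of(asin2category, "B000000000"))  # Unknown
```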
---
<!-- Provide a quick summary of the dataset. -->
This is a large-scale **Amazon Reviews** dataset, collected in **2023** by [McAuley Lab](https://cseweb.ucsd.edu/~jmcauley/), and it includes rich features such as:
1. **User Reviews** (*ratings*, *text*, *helpfulness votes*, etc.);
2. **Item Metadata** (*descriptions*, *price*, *raw image*, etc.);
3. **Links** (*user-item* / *bought together* graphs).
## What's New?
In Amazon Reviews'23, we provide:
1. **Larger Dataset:** We collected 571.54M reviews, 245.2% larger than the last version;
2. **Newer Interactions:** Current interactions range from May 1996 to Sep. 2023;
3. **Richer Metadata:** More descriptive features in item metadata;
4. **Fine-grained Timestamp:** Interaction timestamp at the second or finer level;
5. **Cleaner Processing:** Cleaner item metadata than previous versions;
6. **Standard Splitting:** Standard data splits to encourage RecSys benchmarking.
## Basic Statistics
> We define <b>#R_Tokens</b> as the number of [tokens](https://pypi.org/project/tiktoken/) in user reviews and <b>#M_Tokens</b> as the number of [tokens](https://pypi.org/project/tiktoken/) obtained when treating the dictionaries of item attributes as strings. We emphasize these as important statistics in the era of LLMs.
> We count the number of items based on user reviews rather than item metadata files. Note that some items lack metadata.
### Compared to Previous Versions
| Year | #Review | #User | #Item | #R_Token | #M_Token | #Domain | Timespan |
| ----------- | ---------: | -------: | -------: | ---------: | ------------: | ------------: | ------------: |
| [2013](https://snap.stanford.edu/data/web-Amazon-links.html) | 34.69M | 6.64M | 2.44M | 5.91B | -- | 28 | Jun'96 - Mar'13 |
| [2014](https://cseweb.ucsd.edu/~jmcauley/datasets/amazon/links.html) | 82.83M | 21.13M | 9.86M | 9.16B | 4.14B | 24 | May'96 - Jul'14 |
| [2018](https://cseweb.ucsd.edu/~jmcauley/datasets/amazon_v2/) | 233.10M | 43.53M | 15.17M | 15.73B | 7.99B | 29 | May'96 - Oct'18 |
| <b>[2023](https://)</b> | **571.54M** | **54.51M** | **48.19M** | **30.14B** | **30.78B** | **33** | **May'96 - Sep'23** |
### Grouped by Category
| Category | #User | #Item | #Rating | #R_Token | #M_Token | Download |
| ------------------------ | ------: | ------: | --------: | -------: | -------: | ------------------------------: |
| All_Beauty | 632.0K | 112.6K | 701.5K | 31.6M | 74.1M | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/All_Beauty.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_All_Beauty.jsonl.gz' download> meta </a> |
| Amazon_Fashion | 2.0M | 825.9K | 2.5M | 94.9M | 510.5M | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Amazon_Fashion.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Amazon_Fashion.jsonl.gz' download> meta </a> |
| Appliances | 1.8M | 94.3K | 2.1M | 92.8M | 95.3M | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Appliances.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Appliances.jsonl.gz' download> meta </a> |
| Arts_Crafts_and_Sewing | 4.6M | 801.3K | 9.0M | 350.0M | 695.4M | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Arts_Crafts_and_Sewing.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Arts_Crafts_and_Sewing.jsonl.gz' download> meta </a> |
| Automotive | 8.0M | 2.0M | 20.0M | 824.9M | 1.7B | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Automotive.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Automotive.jsonl.gz' download> meta </a> |
| Baby_Products | 3.4M | 217.7K | 6.0M | 323.3M | 218.6M | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Baby_Products.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Baby_Products.jsonl.gz' download> meta </a> |
| Beauty_and_Personal_Care | 11.3M | 1.0M | 23.9M | 1.1B | 913.7M | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Beauty_and_Personal_Care.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Beauty_and_Personal_Care.jsonl.gz' download> meta </a> |
| Books | 10.3M | 4.4M | 29.5M | 2.9B | 3.7B | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Books.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Books.jsonl.gz' download> meta </a> |
| CDs_and_Vinyl | 1.8M | 701.7K | 4.8M | 514.8M | 287.5M | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/CDs_and_Vinyl.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_CDs_and_Vinyl.jsonl.gz' download> meta </a> |
| Cell_Phones_and_Accessories | 11.6M | 1.3M | 20.8M | 935.4M | 1.3B | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Cell_Phones_and_Accessories.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Cell_Phones_and_Accessories.jsonl.gz' download> meta </a> |
| Clothing_Shoes_and_Jewelry | 22.6M | 7.2M | 66.0M | 2.6B | 5.9B | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Clothing_Shoes_and_Jewelry.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Clothing_Shoes_and_Jewelry.jsonl.gz' download> meta </a> |
| Digital_Music | 101.0K | 70.5K | 130.4K | 11.4M | 22.3M | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Digital_Music.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Digital_Music.jsonl.gz' download> meta </a> |
| Electronics | 18.3M | 1.6M | 43.9M | 2.7B | 1.7B | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Electronics.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Electronics.jsonl.gz' download> meta </a> |
| Gift_Cards | 132.7K | 1.1K | 152.4K | 3.6M | 630.0K | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Gift_Cards.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Gift_Cards.jsonl.gz' download> meta </a> |
| Grocery_and_Gourmet_Food | 7.0M | 603.2K | 14.3M | 579.5M | 462.8M | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Grocery_and_Gourmet_Food.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Grocery_and_Gourmet_Food.jsonl.gz' download> meta </a> |
| Handmade_Products | 586.6K | 164.7K | 664.2K | 23.3M | 125.8M | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Handmade_Products.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Handmade_Products.jsonl.gz' download> meta </a> |
| Health_and_Household | 12.5M | 797.4K | 25.6M | 1.2B | 787.2M | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Health_and_Household.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Health_and_Household.jsonl.gz' download> meta </a> |
| Health_and_Personal_Care | 461.7K | 60.3K | 494.1K | 23.9M | 40.3M | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Health_and_Personal_Care.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Health_and_Personal_Care.jsonl.gz' download> meta </a> |
| Home_and_Kitchen | 23.2M | 3.7M | 67.4M | 3.1B | 3.8B | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Home_and_Kitchen.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Home_and_Kitchen.jsonl.gz' download> meta </a> |
| Industrial_and_Scientific | 3.4M | 427.5K | 5.2M | 235.2M | 363.1M | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Industrial_and_Scientific.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Industrial_and_Scientific.jsonl.gz' download> meta </a> |
| Kindle_Store | 5.6M | 1.6M | 25.6M | 2.2B | 1.7B | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Kindle_Store.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Kindle_Store.jsonl.gz' download> meta </a> |
| Magazine_Subscriptions | 60.1K | 3.4K | 71.5K | 3.8M | 1.3M | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Magazine_Subscriptions.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Magazine_Subscriptions.jsonl.gz' download> meta </a> |
| Movies_and_TV | 6.5M | 747.8K | 17.3M | 1.0B | 415.5M | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Movies_and_TV.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Movies_and_TV.jsonl.gz' download> meta </a> |
| Musical_Instruments | 1.8M | 213.6K | 3.0M | 182.2M | 200.1M | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Musical_Instruments.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Musical_Instruments.jsonl.gz' download> meta </a> |
| Office_Products | 7.6M | 710.4K | 12.8M | 574.7M | 682.8M | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Office_Products.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Office_Products.jsonl.gz' download> meta </a> |
| Patio_Lawn_and_Garden | 8.6M | 851.7K | 16.5M | 781.3M | 875.1M | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Patio_Lawn_and_Garden.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Patio_Lawn_and_Garden.jsonl.gz' download> meta </a> |
| Pet_Supplies | 7.8M | 492.7K | 16.8M | 905.9M | 511.0M | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Pet_Supplies.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Pet_Supplies.jsonl.gz' download> meta </a> |
| Software | 2.6M | 89.2K | 4.9M | 179.4M | 67.1M | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Software.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Software.jsonl.gz' download> meta </a> |
| Sports_and_Outdoors | 10.3M | 1.6M | 19.6M | 986.2M | 1.3B | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Sports_and_Outdoors.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Sports_and_Outdoors.jsonl.gz' download> meta </a> |
| Subscription_Boxes | 15.2K | 641 | 16.2K | 1.0M | 447.0K | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Subscription_Boxes.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Subscription_Boxes.jsonl.gz' download> meta </a> |
| Tools_and_Home_Improvement | 12.2M | 1.5M | 27.0M | 1.3B | 1.5B | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Tools_and_Home_Improvement.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Tools_and_Home_Improvement.jsonl.gz' download> meta </a> |
| Toys_and_Games | 8.1M | 890.7K | 16.3M | 707.9M | 848.3M | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Toys_and_Games.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Toys_and_Games.jsonl.gz' download> meta </a> |
| Video_Games | 2.8M | 137.2K | 4.6M | 347.9M | 137.3M | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Video_Games.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Video_Games.jsonl.gz' download> meta </a> |
| Unknown | 23.1M | 13.2M | 63.8M | 3.3B | 232.8M | <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/review_categories/Unknown.jsonl.gz' download> review</a>, <a href='https://datarepo.eng.ucsd.edu/mcauley_group/data/amazon_2023/raw/meta_categories/meta_Unknown.jsonl.gz' download> meta </a> |
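Each per-category archive above is a gzip-compressed JSON Lines file (one review or item record per line). A minimal sketch for iterating over a locally downloaded file; the filename in the comment is whichever archive you fetched from the table.

```python
import gzip
import json

def iter_jsonl_gz(path):
    """Yield one parsed record per non-empty line of a gzip-compressed JSONL file."""
    with gzip.open(path, "rt", encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:
                yield json.loads(line)

# e.g. collect ratings from a downloaded review file without loading it all at once:
# ratings = [r["rating"] for r in iter_jsonl_gz("All_Beauty.jsonl.gz")]
```

Because it is a generator, this streams arbitrarily large category files with constant memory.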
> Check Pure ID files and corresponding data splitting strategies in <b>[Common Data Processing](https://amazon-reviews-2023.github.io/data_processing/index.html)</b> section.
## Quick Start
### Load User Reviews
```python
from datasets import load_dataset
dataset = load_dataset("McAuley-Lab/Amazon-Reviews-2023", "raw_review_All_Beauty", trust_remote_code=True)
print(dataset["full"][0])
```
```json
{'rating': 5.0,
'title': 'Such a lovely scent but not overpowering.',
'text': "This spray is really nice. It smells really good, goes on really fine, and does the trick. I will say it feels like you need a lot of it though to get the texture I want. I have a lot of hair, medium thickness. I am comparing to other brands with yucky chemicals so I'm gonna stick with this. Try it!",
'images': [],
'asin': 'B00YQ6X8EO',
'parent_asin': 'B00YQ6X8EO',
'user_id': 'AGKHLEW2SOWHNMFQIJGBECAF7INQ',
'timestamp': 1588687728923,
'helpful_vote': 0,
'verified_purchase': True}
```
### Load Item Metadata
```python
dataset = load_dataset("McAuley-Lab/Amazon-Reviews-2023", "raw_meta_All_Beauty", split="full", trust_remote_code=True)
print(dataset[0])
```
```json
{'main_category': 'All Beauty',
'title': 'Howard LC0008 Leather Conditioner, 8-Ounce (4-Pack)',
'average_rating': 4.8,
'rating_number': 10,
'features': [],
'description': [],
'price': 'None',
'images': {'hi_res': [None,
'https://m.media-amazon.com/images/I/71i77AuI9xL._SL1500_.jpg'],
'large': ['https://m.media-amazon.com/images/I/41qfjSfqNyL.jpg',
'https://m.media-amazon.com/images/I/41w2yznfuZL.jpg'],
'thumb': ['https://m.media-amazon.com/images/I/41qfjSfqNyL._SS40_.jpg',
'https://m.media-amazon.com/images/I/41w2yznfuZL._SS40_.jpg'],
'variant': ['MAIN', 'PT01']},
'videos': {'title': [], 'url': [], 'user_id': []},
'store': 'Howard Products',
'categories': [],
'details': '{"Package Dimensions": "7.1 x 5.5 x 3 inches; 2.38 Pounds", "UPC": "617390882781"}',
'parent_asin': 'B01CUPMQZE',
'bought_together': None,
'subtitle': None,
'author': None}
```
> Check data loading examples and Huggingface datasets APIs in <b>[Common Data Loading](https://amazon-reviews-2023.github.io/data_loading/index.html)</b> section.
## Data Fields
### For User Reviews
| Field | Type | Explanation |
| ----- | ---- | ----------- |
| rating | float | Rating of the product (from 1.0 to 5.0). |
| title | str | Title of the user review. |
| text | str | Text body of the user review. |
| images | list | Images that users post after they have received the product. Each image has different sizes (small, medium, large), represented by the small_image_url, medium_image_url, and large_image_url respectively. |
| asin | str | ID of the product. |
| parent_asin | str | Parent ID of the product. Note: Products with different colors, styles, sizes usually belong to the same parent ID. The “asin” in previous Amazon datasets is actually parent ID. <b>Please use parent ID to find product meta.</b> |
| user_id | str | ID of the reviewer. |
| timestamp | int | Time of the review (unix time). |
| verified_purchase | bool | User purchase verification. |
| helpful_vote | int | Helpful votes of the review. |
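The `timestamp` field is a millisecond-precision unix time (the example review above carries `1588687728923`, a 13-digit value); a quick conversion sketch:

```python
from datetime import datetime, timezone

def review_time(ts_ms):
    """Convert a millisecond unix timestamp into an aware UTC datetime."""
    return datetime.fromtimestamp(ts_ms / 1000, tz=timezone.utc)

print(review_time(1588687728923))  # 2020-05-05 14:08:48.923000+00:00
```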
### For Item Metadata
| Field | Type | Explanation |
| ----- | ---- | ----------- |
| main_category | str | Main category (i.e., domain) of the product. |
| title | str | Name of the product. |
| average_rating | float | Rating of the product shown on the product page. |
| rating_number | int | Number of ratings in the product. |
| features | list | Bullet-point format features of the product. |
| description | list | Description of the product. |
| price | float | Price in US dollars (at time of crawling). |
| images | list | Images of the product. Each image has different sizes (thumb, large, hi_res). The “variant” field shows the position of the image. |
| videos | list | Videos of the product including title and url. |
| store | str | Store name of the product. |
| categories | list | Hierarchical categories of the product. |
| details | dict | Product details, including materials, brand, sizes, etc. |
| parent_asin | str | Parent ID of the product. |
| bought_together | list | Recommended bundles from the websites. |
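Note that `details` arrives as a JSON-encoded string (see the metadata example above), so it must be decoded before use; a small sketch:

```python
import json

def parse_details(item):
    """Decode the JSON-string 'details' field into a dict (empty when missing)."""
    return json.loads(item.get("details") or "{}")

item = {"details": '{"Package Dimensions": "7.1 x 5.5 x 3 inches; 2.38 Pounds", "UPC": "617390882781"}'}
print(parse_details(item)["UPC"])  # 617390882781
```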
## Citation
```bibtex
@article{hou2024bridging,
title={Bridging Language and Items for Retrieval and Recommendation},
author={Hou, Yupeng and Li, Jiacheng and He, Zhankui and Yan, An and Chen, Xiusi and McAuley, Julian},
journal={arXiv preprint arXiv:2403.03952},
year={2024}
}
```
## Contact Us
- **Report Bugs**: To report bugs in the dataset, please file an issue on our [GitHub](https://github.com/hyp1231/AmazonReviews2023/issues/new).
- **Others**: For research collaborations or other questions, please email **yphou AT ucsd.edu**. |
sethapun/arithmetic_2md_1to1000 | ---
dataset_info:
features:
- name: expression
dtype: string
- name: answer
dtype: float64
- name: label
dtype:
class_label:
names:
'0': 'false'
'1': 'true'
splits:
- name: train
num_bytes: 61528
num_examples: 2000
- name: validation
num_bytes: 12316
num_examples: 400
download_size: 36193
dataset_size: 73844
---
# Dataset Card for "arithmetic_2md_1to1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kikoweb/padre | ---
license: openrail
---
|
AZSXDCFV123/dataset_repository_name | ---
configs:
- config_name: default
data_files:
- split: train
path: data.csv
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
bigscience-data/roots_ar_arabench | ---
language: ar
license: apache-2.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_ar_arabench
# arabench
- Dataset uid: `arabench`
### Description
AraBench is an evaluation suite for dialectal Arabic to English machine translation. It offers 4 coarse-grained, 15 fine-grained and 25 city-level dialect categories across diverse genres, such as media, chat, religion and travel, with varying levels of dialectness.
### Homepage
https://alt.qcri.org/resources1/mt/arabench/
### Licensing
- open license
- cc-by-4.0: Creative Commons Attribution 4.0 International
### Speaker Locations
- Northern Africa
- Western Asia
- Algeria
- Egypt
- Morocco
- Jordan
- Sudan
- Tunisia
- Lebanon
- Libya
- Iraq
- Qatar
- Yemen
- Oman
- Saudi Arabia
- Syria
- Palestine
### Sizes
- 0.0018 % of total
- 0.0165 % of ar
### BigScience processing steps
#### Filters applied to: ar
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
|
datajuicer/redpajama-pile-stackexchange-refined-by-data-juicer | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
tags:
- data-juicer
- pretraining
size_categories:
- 10M<n<100M
---
# RedPajama & The Pile -- StackExchange (refined by Data-Juicer)
A refined version of the StackExchange subset of RedPajama & The Pile, produced by [Data-Juicer](https://github.com/alibaba/data-juicer). Some "bad" samples were removed from the original merged dataset to make it higher quality.
This dataset is typically used to pretrain a large language model.
**Notice**: Here is a small subset for previewing. The whole dataset is available [here](https://dail-wlcb.oss-cn-wulanchabu.aliyuncs.com/LLM_data/our_refined_datasets/pretraining/redpajama-pile-stackexchange-refine-result.jsonl) (about 71 GB).
## Dataset Information
- Number of samples: 26,309,203 (keeps ~57.89% of the original dataset)
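As a quick sanity check (my own back-of-envelope arithmetic, not a figure stated on the card), the keep ratio above implies an original merged corpus of roughly 45.4 million samples:

```python
kept = 26_309_203        # samples remaining after refinement, per the card
keep_ratio = 0.5789      # ~57.89% of the original dataset kept

# Implied size of the original merged StackExchange corpus
original = round(kept / keep_ratio)
removed = original - kept

print(f"original ≈ {original:,}, removed ≈ {removed:,}")
```

This puts the pre-filtering corpus at about 45.4M samples, with roughly 19.1M removed by the recipe below.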
## Refining Recipe
```yaml
# global parameters
project_name: 'Data-Juicer-stack-exchange'
dataset_path: '/path/to/your/dataset' # path to your dataset directory or file
export_path: '/path/to/your/dataset.jsonl'
np: 50  # number of subprocesses used to process your dataset
open_tracer: true
# process schedule
# a list of several process operators with their arguments
process:
- clean_email_mapper:
- clean_links_mapper:
- fix_unicode_mapper:
- punctuation_normalization_mapper:
- whitespace_normalization_mapper:
- alphanumeric_filter:
tokenization: false
min_ratio: 0.35 # <3sigma
max_ratio: 0.943 # 3sigma
- average_line_length_filter: # for code
min_len: 20 # >3sigma
max_len: 400 # >3sigma
- character_repetition_filter:
rep_len: 10
max_ratio: 0.4 # >3sigma (0.12)
- flagged_words_filter:
lang: en
tokenization: true
max_ratio: 0.01 # >3sigma
- language_id_score_filter: # remove language filter
min_score: 0.1 # <3sigma
- maximum_line_length_filter: # for code
min_len: 80
- perplexity_filter:
lang: en
max_ppl: 10000 # >3sigma
- special_characters_filter:
min_ratio: 0.232 # 3sigma
max_ratio: 0.7 # >3sigma
- text_length_filter:
min_len: 200
- words_num_filter:
lang: en
tokenization: true
min_num: 100
- word_repetition_filter:
lang: en
tokenization: true
rep_len: 10
max_ratio: 0.8 # >3sigma
  - document_simhash_deduplicator: # 26,309,203 samples left
tokenization: space
window_size: 3
lowercase: true
ignore_pattern: '\n\n'
num_blocks: 9
hamming_distance: 7
``` |
Abzu/RedPajama-Data-1T-arxiv-filtered | ---
dataset_info:
features:
- name: text
dtype: string
- name: meta
dtype: string
- name: red_pajama_subset
dtype: string
splits:
- name: train
num_bytes: 229340859.5333384
num_examples: 3911
download_size: 104435457
dataset_size: 229340859.5333384
---
# Dataset Card for "RedPajama-Data-1T-arxiv-filtered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rushai-dev/fmcw-vital-signs | ---
license: apache-2.0
---
|
huizhoucheng/summary-auto-train-small-2 | ---
dataset_info:
features:
- name: article
dtype: string
- name: highlights
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 10060340
num_examples: 2571
- name: validation
num_bytes: 485669
num_examples: 133
- name: test
num_bytes: 399200
num_examples: 114
download_size: 6609537
dataset_size: 10945209
---
# Dataset Card for "summary-auto-train-small-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Intuit-GenSRF/es_lawyer_instruct | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: input
dtype: float64
- name: split
dtype: string
- name: text
dtype: string
- name: text_spanish
dtype: string
splits:
- name: train
num_bytes: 16852186
num_examples: 9241
download_size: 7403208
dataset_size: 16852186
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "es_lawyer_instruct"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_RaduGabriel__MUZD | ---
pretty_name: Evaluation run of RaduGabriel/MUZD
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [RaduGabriel/MUZD](https://huggingface.co/RaduGabriel/MUZD) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\ \nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RaduGabriel__MUZD\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-14T10:37:51.263631](https://huggingface.co/datasets/open-llm-leaderboard/details_RaduGabriel__MUZD/blob/main/results_2024-02-14T10-37-51.263631.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6315569902368788,\n\
\ \"acc_stderr\": 0.03257903006074675,\n \"acc_norm\": 0.633414948028734,\n\
\ \"acc_norm_stderr\": 0.033238569460214515,\n \"mc1\": 0.48959608323133413,\n\
\ \"mc1_stderr\": 0.017499711430249264,\n \"mc2\": 0.6572688672491508,\n\
\ \"mc2_stderr\": 0.014888678305017567\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6151877133105802,\n \"acc_stderr\": 0.014218371065251102,\n\
\ \"acc_norm\": 0.6680887372013652,\n \"acc_norm_stderr\": 0.013760988200880536\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6719776936865166,\n\
\ \"acc_stderr\": 0.0046853348440386595,\n \"acc_norm\": 0.8653654650468035,\n\
\ \"acc_norm_stderr\": 0.0034063520713417173\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952928,\n\
\ \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952928\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337152,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337152\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424648,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424648\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6741935483870968,\n\
\ \"acc_stderr\": 0.026662010578567104,\n \"acc_norm\": 0.6741935483870968,\n\
\ \"acc_norm_stderr\": 0.026662010578567104\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386424,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386424\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306433,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306433\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902796,\n\
\ \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902796\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473082,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473082\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059285,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059285\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010323,\n \"\
acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010323\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565438,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565438\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467617,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467617\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909476,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909476\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n\
\ \"acc_stderr\": 0.013964393769899133,\n \"acc_norm\": 0.8122605363984674,\n\
\ \"acc_norm_stderr\": 0.013964393769899133\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545546,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545546\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41675977653631285,\n\
\ \"acc_stderr\": 0.016489134962438954,\n \"acc_norm\": 0.41675977653631285,\n\
\ \"acc_norm_stderr\": 0.016489134962438954\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4589308996088657,\n\
\ \"acc_stderr\": 0.012727084826799798,\n \"acc_norm\": 0.4589308996088657,\n\
\ \"acc_norm_stderr\": 0.012727084826799798\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254184,\n\
\ \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254184\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000325,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000325\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.02927956741106568,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.02927956741106568\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6119402985074627,\n\
\ \"acc_stderr\": 0.034457899643627506,\n \"acc_norm\": 0.6119402985074627,\n\
\ \"acc_norm_stderr\": 0.034457899643627506\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.48959608323133413,\n\
\ \"mc1_stderr\": 0.017499711430249264,\n \"mc2\": 0.6572688672491508,\n\
\ \"mc2_stderr\": 0.014888678305017567\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.813733228097869,\n \"acc_stderr\": 0.010941877955676211\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5860500379075056,\n \
\ \"acc_stderr\": 0.013566991960151778\n }\n}\n```"
repo_url: https://huggingface.co/RaduGabriel/MUZD
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|arc:challenge|25_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|gsm8k|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hellaswag|10_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T10-37-51.263631.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T10-37-51.263631.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- '**/details_harness|winogrande|5_2024-02-14T10-37-51.263631.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-14T10-37-51.263631.parquet'
- config_name: results
data_files:
- split: 2024_02_14T10_37_51.263631
path:
- results_2024-02-14T10-37-51.263631.parquet
- split: latest
path:
- results_2024-02-14T10-37-51.263631.parquet
---
# Dataset Card for Evaluation run of RaduGabriel/MUZD
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [RaduGabriel/MUZD](https://huggingface.co/RaduGabriel/MUZD) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RaduGabriel__MUZD",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-14T10:37:51.263631](https://huggingface.co/datasets/open-llm-leaderboard/details_RaduGabriel__MUZD/blob/main/results_2024-02-14T10-37-51.263631.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6315569902368788,
"acc_stderr": 0.03257903006074675,
"acc_norm": 0.633414948028734,
"acc_norm_stderr": 0.033238569460214515,
"mc1": 0.48959608323133413,
"mc1_stderr": 0.017499711430249264,
"mc2": 0.6572688672491508,
"mc2_stderr": 0.014888678305017567
},
"harness|arc:challenge|25": {
"acc": 0.6151877133105802,
"acc_stderr": 0.014218371065251102,
"acc_norm": 0.6680887372013652,
"acc_norm_stderr": 0.013760988200880536
},
"harness|hellaswag|10": {
"acc": 0.6719776936865166,
"acc_stderr": 0.0046853348440386595,
"acc_norm": 0.8653654650468035,
"acc_norm_stderr": 0.0034063520713417173
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952928,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337152,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337152
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.02519710107424648,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.02519710107424648
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6741935483870968,
"acc_stderr": 0.026662010578567104,
"acc_norm": 0.6741935483870968,
"acc_norm_stderr": 0.026662010578567104
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386424,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386424
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306433,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306433
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902796,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473082,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473082
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059285,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059285
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010323,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010323
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565438,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565438
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467617,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909476,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909476
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899133,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899133
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545546,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545546
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41675977653631285,
"acc_stderr": 0.016489134962438954,
"acc_norm": 0.41675977653631285,
"acc_norm_stderr": 0.016489134962438954
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4589308996088657,
"acc_stderr": 0.012727084826799798,
"acc_norm": 0.4589308996088657,
"acc_norm_stderr": 0.012727084826799798
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254184,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254184
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000325,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000325
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.02927956741106568,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.02927956741106568
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6119402985074627,
"acc_stderr": 0.034457899643627506,
"acc_norm": 0.6119402985074627,
"acc_norm_stderr": 0.034457899643627506
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.48959608323133413,
"mc1_stderr": 0.017499711430249264,
"mc2": 0.6572688672491508,
"mc2_stderr": 0.014888678305017567
},
"harness|winogrande|5": {
"acc": 0.813733228097869,
"acc_stderr": 0.010941877955676211
},
"harness|gsm8k|5": {
"acc": 0.5860500379075056,
"acc_stderr": 0.013566991960151778
}
}
```
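Each per-task entry above shares the same shape. As a rough sketch (the `results` literal below is a small hypothetical subset of the JSON above, used only for illustration), an aggregate accuracy across tasks can be computed like this:

```python
# Sketch: average the "acc" field across harness tasks in a results dict
# shaped like the JSON above. The `results` literal is a hypothetical
# three-task subset for illustration only.
results = {
    "harness|hendrycksTest-management|5": {"acc": 0.7378640776699029},
    "harness|hendrycksTest-marketing|5": {"acc": 0.8931623931623932},
    "harness|gsm8k|5": {"acc": 0.5860500379075056},
}

accs = [v["acc"] for v in results.values() if "acc" in v]
mean_acc = sum(accs) / len(accs)
print(f"mean acc over {len(accs)} tasks: {mean_acc:.4f}")
```

The same pattern extends to `acc_norm`, `mc1`/`mc2`, or any other metric key present in a task's entry.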
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/jessica_granbluefantasy | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of jessica/ジェシカ (Granblue Fantasy)
This is the dataset of jessica/ジェシカ (Granblue Fantasy), containing 97 images and their tags.
The core tags of this character are `long_hair, black_hair, goggles_on_head, breasts, black_eyes, large_breasts, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 97 | 85.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jessica_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 97 | 61.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jessica_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 192 | 112.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jessica_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 97 | 80.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jessica_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 192 | 138.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jessica_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/jessica_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, black_thighhighs, goggles, solo, white_gloves, looking_at_viewer, smile, sitting, cleavage, weapon, blush |
| 1 | 5 |  |  |  |  |  | 1girl, black_thighhighs, china_dress, cleavage_cutout, goggles, looking_at_viewer, smile, solo, white_gloves, blush, side_slit, ass, bare_shoulders, breast_hold |
| 2 | 25 |  |  |  |  |  | 1girl, goggles, solo, cleavage, looking_at_viewer, smile, white_gloves, animal_ears, blush, black_bikini, frilled_bikini, navel, simple_background, medium_breasts, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_thighhighs | goggles | solo | white_gloves | looking_at_viewer | smile | sitting | cleavage | weapon | blush | china_dress | cleavage_cutout | side_slit | ass | bare_shoulders | breast_hold | animal_ears | black_bikini | frilled_bikini | navel | simple_background | medium_breasts | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:----------|:-------|:---------------|:--------------------|:--------|:----------|:-----------|:---------|:--------|:--------------|:------------------|:------------|:------|:-----------------|:--------------|:--------------|:---------------|:-----------------|:--------|:--------------------|:-----------------|:-------------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | | | | X | X | X | X | X | X | X | | | | | | | |
| 2 | 25 |  |  |  |  |  | X | | X | X | X | X | X | | X | | X | | | | | | | X | X | X | X | X | X | X |
|
trevfran/perfil | ---
license: other
---
|
distilled-one-sec-cv12-each-chunk-uniq/chunk_120 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1440244480.0
num_examples: 280640
download_size: 1476753470
dataset_size: 1440244480.0
---
# Dataset Card for "chunk_120"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AgentWaller/dutch-oasst1-qa-format | ---
license: apache-2.0
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 8100241
num_examples: 9843
- name: validation
num_bytes: 409962
num_examples: 517
download_size: 5049986
dataset_size: 8510203
---
|
automorphic/runhouse | ---
dataset_info:
features:
- name: text
dtype: string
- name: paths
sequence: string
- name: type
dtype: string
splits:
- name: train
num_bytes: 2730767
num_examples: 523
download_size: 986939
dataset_size: 2730767
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kenobi/GeneLab_BPS_BenchmarkData | ---
task_categories:
- image-classification
- image-segmentation
- feature-extraction
- zero-shot-classification
- fill-mask
size_categories:
- 10K<n<100K
tags:
- biology
- cells
- radiation
- microscopy
- GeneLab
---
# Dataset Card for Dataset GeneLab_BPS_BenchmarkData
## Dataset Details
This dataset is a version of the Biological and Physical Sciences (BPS) Microscopy Benchmark Training Dataset managed by NASA and hosted on an S3 Bucket here: https://registry.opendata.aws/bps_microscopy/
Fluorescence microscopy images of individual nuclei from mouse fibroblast cells irradiated with Fe particles or X-rays, with fluorescent foci indicating 53BP1 positivity, a marker of DNA damage. The images are maximum-intensity projections of 9-layer microscopy Z-stacks.
### Dataset Description
- **Curated by:** Frank Soboczenski
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** There are no restrictions on the use of this data.
## Further Documentation
https://docs.google.com/document/d/e/2PACX-1vTIjUPenLxVX0stErsBbK884QMJW_Ur1mqHJ9K3KIZl3klT90cxHDppsEvz5Z6Skdu13X8tzghqyWcN/pub
### Update Frequency
New fluorescence microscopy data of mouse fibroblast nuclei is added whenever it becomes available.
## Dataset Structure
GeneLab_BPS_BenchmarkData/<br>
├── README.md<br>
└── data/<br>
├── High_Energy_Ion_Fe_Nuclei/<br>
│ └── .tif<br>
│ └── ...<br>
└── XRay_irradiated_Nucleiation/<br>
└── .tif<br>
└── ...<br>
## How to Cite
Biological and Physical Sciences (BPS) Microscopy Benchmark Training Dataset was accessed on DATE from https://huggingface.co/datasets/kenobi/GeneLab_BPS_BenchmarkData
## Publications
- Dose, LET and Strain Dependence of Radiation-Induced 53BP1 Foci in 15 Mouse Strains Ex Vivo Introducing Novel DNA Damage Metrics by Sébastien Penninckx, Egle Cekanaviciute, Charlotte Degorre, Elodie Guiet, Louise Viger, Stéphane Lucas, Sylvain V. Costes
- NASA SMD AI Workshop Report by SMD Artificial Intelligence (AI) Initiative
## Dataset Card Authors [optional]
Lauren Sanders (lauren.m.sanders@nasa.gov)
## Dataset Card Contact
Frank Soboczenski (Frank.Soboczenski@york.ac.uk) |
as674262040/zhangwenhe | ---
task_categories:
- text-generation
pretty_name: zhangwenhe
--- |
vwxyzjn/summarize_from_feedback_oai_preprocessing_1708445155 | ---
dataset_info:
features:
- name: info
struct:
- name: id
dtype: string
- name: post
dtype: string
- name: title
dtype: string
- name: subreddit
dtype: string
- name: site
dtype: string
- name: article
dtype: string
- name: summaries
list:
- name: text
dtype: string
- name: policy
dtype: string
- name: note
dtype: string
- name: choice
dtype: int32
- name: worker
dtype: string
- name: batch
dtype: string
- name: split
dtype: string
- name: extra
struct:
- name: confidence
dtype: int32
- name: query_token
sequence: int64
- name: query
dtype: string
- name: chosen
dtype: string
- name: chosen_token
sequence: int64
- name: chosen_token_len
dtype: int64
- name: rejected
dtype: string
- name: rejected_token
sequence: int64
- name: rejected_token_len
dtype: int64
- name: chosen_policy
dtype: string
- name: rejected_policy
dtype: string
- name: policies
dtype: string
- name: query_chosen
dtype: string
- name: query_chosen_token
sequence: int64
- name: query_chosen_token_len
dtype: int64
- name: query_rejected
dtype: string
- name: query_rejected_token
sequence: int64
- name: query_rejected_token_len
dtype: int64
- name: query_token_len
dtype: int64
- name: query_chosen_token_response_label
sequence: int64
- name: query_rejected_token_response_label
sequence: int64
splits:
- name: train
num_bytes: 2188710827
num_examples: 92858
- name: validation
num_bytes: 1987980815
num_examples: 83802
- name: validation_cnndm
num_bytes: 137467119
num_examples: 2284
download_size: 422920668
dataset_size: 4314158761
---
# Dataset Card for "summarize_from_feedback_oai_preprocessing_1708445155"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
systemk/opticsqa | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
struct:
- name: label
sequence: string
- name: text
sequence: string
- name: answer
dtype: string
- name: answer_text
dtype: string
splits:
- name: train
num_bytes: 131212
num_examples: 496
download_size: 62566
dataset_size: 131212
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
thobauma/harmless-poisoned-0.05-dollar-murder | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 58402939.44335993
num_examples: 42537
download_size: 31364075
dataset_size: 58402939.44335993
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Ramos-Ramos/smallnorb | ---
dataset_info:
features:
- name: image_lt
dtype: image
- name: image_rt
dtype: image
- name: category
dtype: int32
- name: instance
dtype: int32
- name: elevation
dtype: int32
- name: azimuth
dtype: int32
- name: lighting
dtype: int32
splits:
- name: train
num_bytes: 117947794.0
num_examples: 24300
- name: test
num_bytes: 118130266.0
num_examples: 24300
download_size: 236815224
dataset_size: 236078060.0
---
# Dataset Card for "smallnorb"
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
**NOTE:** This dataset is an unofficial port of small NORB based on a [repo from Andrea Palazzi](https://github.com/ndrplz/small_norb) using this [script](https://colab.research.google.com/drive/1Tx20uP1PrnyarsNCWf1dN9EQyr38BDIE?usp=sharing). For complete and accurate information, we highly recommend visiting the dataset's original homepage.
- **Homepage:** https://cs.nyu.edu/~ylclab/data/norb-v1.0-small/
- **Paper:** https://ieeexplore.ieee.org/document/1315150
### Dataset Summary
From the dataset's [homepage](https://cs.nyu.edu/~ylclab/data/norb-v1.0-small/):
> This database is intended for experiments in 3D object recognition from shape. It contains images of 50 toys belonging to 5 generic categories: four-legged animals, human figures, airplanes, trucks, and cars. The objects were imaged by two cameras under 6 lighting conditions, 9 elevations (30 to 70 degrees every 5 degrees), and 18 azimuths (0 to 340 every 20 degrees).
>
> The training set is composed of 5 instances of each category (instances 4, 6, 7, 8 and 9), and the test set of the remaining 5 instances (instances 0, 1, 2, 3, and 5).
## Dataset Structure
### Data Instances
An example of an instance in this dataset:
```
{
'image_lt': <PIL.PngImagePlugin.PngImageFile image mode=L size=96x96 at 0x...>,
'image_rt': <PIL.PngImagePlugin.PngImageFile image mode=L size=96x96 at 0x...>,
'category': 0,
'instance': 8,
'elevation': 6,
'azimuth': 4,
'lighting': 4
}
```
### Data Fields
Explanation of this dataset's fields:
- `image_lt`: a PIL image of an object from the dataset taken with one of two cameras
- `image_rt`: a PIL image of an object from the dataset taken with one of two cameras
- `category`: the category of the object shown in the images
- `instance`: the instance of the category of the object shown in the images
- `elevation`: the label of the elevation of the cameras used in capturing a picture of the object
- `azimuth`: the label of the azimuth of the cameras used in capturing a picture of the object
- `lighting`: the label of the lighting condition used in capturing a picture of the object
For more information on what these categories and labels pertain to, please see [Dataset Summary](#dataset-summary) or the [repo](https://github.com/ndrplz/small_norb) used in processing the dataset.
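As an illustration only, if the integer labels simply index the ranges quoted in the summary (elevations 30–70° in 5° steps, azimuths 0–340° in 20° steps) — an assumption that should be verified against the original NORB documentation before use — the pose labels could be decoded like this:

```python
# Hypothetical decoding of smallNORB pose labels, ASSUMING the integer
# labels index the ranges quoted in the summary above. Verify this
# convention against the original NORB documentation before relying on it.
def decode_pose(elevation_label, azimuth_label):
    elevation_deg = 30 + 5 * elevation_label  # labels 0..8  -> 30..70 degrees
    azimuth_deg = 20 * azimuth_label          # labels 0..17 -> 0..340 degrees
    return elevation_deg, azimuth_deg
```

Under that assumption, the example instance above (`elevation: 6`, `azimuth: 4`) would correspond to a 60° elevation and an 80° azimuth.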
### Data Splits
Information on this dataset's splits:
| | train | test |
|------|------:|------:|
| size | 24300 | 24300 |
## Additional Information
### Dataset Curators
Credits from the dataset's [homepage](https://cs.nyu.edu/~ylclab/data/norb-v1.0-small/):
> [Fu Jie Huang](http://www.cs.nyu.edu/jhuangfu/), [Yann LeCun](http://yann.lecun.com/)
>
> Courant Institute, New York University
>
> October, 2005
### Licensing Information
From the dataset's [homepage](https://cs.nyu.edu/~ylclab/data/norb-v1.0-small/):
> This database is provided for research purposes. It cannot be sold. Publications that include results obtained with this database should reference the following paper:
>
> Y. LeCun, F.J. Huang, L. Bottou, Learning Methods for Generic Object Recognition with Invariance to Pose and Lighting. IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR) 2004
### Citation Information
From the dataset's [homepage](https://cs.nyu.edu/~ylclab/data/norb-v1.0-small/):
> Publications that include results obtained with this database should reference the following paper:
>
> Y. LeCun, F.J. Huang, L. Bottou, Learning Methods for Generic Object Recognition with Invariance to Pose and Lighting. IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR) 2004
```
@inproceedings{lecun2004learning,
title={Learning methods for generic object recognition with invariance to pose and lighting},
author={LeCun, Yann and Huang, Fu Jie and Bottou, Leon},
booktitle={Proceedings of the 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2004. CVPR 2004.},
volume={2},
pages={II--104},
year={2004},
organization={IEEE}
}
```
DOI: [10.1109/CVPR.2004.1315150](https://doi.org/10.1109/CVPR.2004.1315150)
### Contributions
Code to process small NORB adapted from [Andrea Palazzi's repo](https://github.com/ndrplz/small_norb) with this [script](https://colab.research.google.com/drive/1Tx20uP1PrnyarsNCWf1dN9EQyr38BDIE?usp=sharing). |
matlok/python-audio-copilot-training-using-function-knowledge-graphs | ---
license:
- other
pretty_name: >-
python copilot audio training using global functions with knowledge graphs
dataset_info:
- config_name: view_schema
splits:
- name: view_schema
configs:
- config_name: view_schema
data_files:
- split: view_schema
path: files/lok-python-copilot-audio.func-v1_00000095.parquet
size_categories:
- 10K<n<100K
tags:
- python-copilot
- python-coding
- python-architecture
- knowledge-graphs
- multimodal
- text-image-audio
- fine-tuning
- training
- question-answering
- image-knowledge-graph
- alpaca
- mp3
- png
- text
- instruct
- functions
- global-functions
# supported task_categories
# text-classification, token-classification, table-question-answering, question-answering, zero-shot-classification, translation, summarization, conversational, feature-extraction, text-generation, text2text-generation, fill-mask, sentence-similarity, text-to-speech, text-to-audio, automatic-speech-recognition, audio-to-audio, audio-classification, voice-activity-detection, depth-estimation, image-classification, object-detection, image-segmentation, text-to-image, image-to-text, image-to-image, image-to-video, unconditional-image-generation, video-classification, reinforcement-learning, robotics, tabular-classification, tabular-regression, tabular-to-text, table-to-text, multiple-choice, text-retrieval, time-series-forecasting, text-to-video, visual-question-answering, document-question-answering, zero-shot-image-classification, graph-ml, mask-generation, zero-shot-object-detection, text-to-3d, image-to-3d, other
task_categories:
- text-to-audio
- audio-to-audio
- question-answering
# supported task_ids
# acceptability-classification, entity-linking-classification, fact-checking, intent-classification, language-identification, multi-class-classification, multi-label-classification, multi-input-text-classification, natural-language-inference, semantic-similarity-classification, sentiment-classification, topic-classification, semantic-similarity-scoring, sentiment-scoring, sentiment-analysis, hate-speech-detection, text-scoring, named-entity-recognition, part-of-speech, parsing, lemmatization, word-sense-disambiguation, coreference-resolution, extractive-qa, open-domain-qa, closed-domain-qa, news-articles-summarization, news-articles-headline-generation, dialogue-generation, dialogue-modeling, language-modeling, text-simplification, explanation-generation, abstractive-qa, open-domain-abstractive-qa, closed-domain-qa, open-book-qa, closed-book-qa, slot-filling, masked-language-modeling, keyword-spotting, speaker-identification, audio-intent-classification, audio-emotion-recognition, audio-language-identification, multi-label-image-classification, multi-class-image-classification, face-detection, vehicle-detection, instance-segmentation, semantic-segmentation, panoptic-segmentation, image-captioning, image-inpainting, image-colorization, super-resolution, grasping, task-planning, tabular-multi-class-classification, tabular-multi-label-classification, tabular-single-column-regression, rdf-to-text, multiple-choice-qa, multiple-choice-coreference-resolution, document-retrieval, utterance-retrieval, entity-linking-retrieval, fact-checking-retrieval, univariate-time-series-forecasting, multivariate-time-series-forecasting, visual-question-answering, document-question-answering
task_ids:
- parsing
---
## Python Copilot Audio Training using Global Functions with Knowledge Graphs
This dataset is a subset of the matlok python copilot datasets. Please refer to the [Multimodal Python Copilot Training Overview](https://huggingface.co/datasets/matlok/multimodal-python-copilot-training-overview) for more details on how to use this dataset.
### Details
Each global function has a question and answer mp3 where one voice reads the question and another voice reads the answer. Both mp3s are stored in the parquet **dbytes** column, with the associated source code identified by the **file_path** column.
- Rows: 49910
- Size: 62.8 GB
- Data type: mp3
- Format: narrated alpaca question and answers using two voices
### Schema
```
{
"audio_path": "string",
"audio_type": "string",
"dbytes": "binary",
"dbytes_len": "int64",
"file_path": "string",
"file_path_len": "int64",
"lang": "string",
"lang_len": "int64",
"recsize": "int64"
}
```
### How to use the dataset
```python
from datasets import load_dataset
ds = load_dataset("matlok/python-audio-copilot-training-using-function-knowledge-graphs", data_dir="files")
```
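Once loaded, each row's **dbytes** column holds raw mp3 bytes. A minimal sketch for writing one row's audio to disk (the field names follow the schema above; the `row` passed in is any mapping with those fields):

```python
from pathlib import Path

def save_row_audio(row, out_dir="audio_out"):
    """Write one row's mp3 bytes to disk, naming the file after the
    basename of the row's audio_path (schema fields as listed above)."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    dest = out / Path(row["audio_path"]).name
    dest.write_bytes(row["dbytes"])
    return dest
```

For example, `save_row_audio(ds["view_schema"][0])` would write the first row's narrated question-and-answer mp3, assuming the `view_schema` split name from the config above.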
|
pai2996/sd_celeb | ---
size_categories:
- n<1K
--- |
shidowake/cosmopedia-japanese-subset_from_aixsatoshi_filtered-sharegpt-format-with-system-prompt_split_5 | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 3990625.4590984974
num_examples: 499
download_size: 2417911
dataset_size: 3990625.4590984974
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|