datasetId | card |
|---|---|
vwxyzjn/ultrachat_200k_filtered_1707919193 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: query_reference_response
list:
- name: content
dtype: string
- name: role
dtype: string
- name: query_reference_response_token
sequence: int64
- name: query_reference_response_token_len
dtype: int64
- name: query
list:
- name: content
dtype: string
- name: role
dtype: string
- name: query_token
sequence: int64
- name: query_token_len
dtype: int64
- name: reference_response
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: reference_response_token
sequence: int64
- name: reference_response_token_len
dtype: int64
splits:
- name: test_gen
num_bytes: 30484069
num_examples: 1000
- name: test_sft
num_bytes: 39592502
num_examples: 1000
- name: train_gen
num_bytes: 29613744
num_examples: 1000
- name: train_sft
num_bytes: 39521233
num_examples: 1000
download_size: 50859072
dataset_size: 139211548
---
# Args
```python
{'base_model': 'mistralai/Mistral-7B-v0.1',
'check_length_correctness': True,
'debug': True,
'hf_entity': 'vwxyzjn',
'params': TaskQueryHParams(length=3000,
format_str='SUBREDDIT: r/{subreddit}\n'
'\n'
'TITLE: {title}\n'
'\n'
'POST: {post}\n'
'\n'
'TL;DR:',
truncate_field='post',
truncate_text='\n',
padding='pad_token',
pad_token=[32000],
pad_side='left',
max_sft_response_length=1500,
max_sft_query_response_length=4500,
max_rm_response_length=169,
max_rm_query_response_length=638),
'push_to_hub': True}
```
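The formatting and padding parameters above can be illustrated with a minimal sketch. The helper names (`left_pad`, `FORMAT_STR`) are hypothetical and only for illustration; the format string, the pad token id `32000`, and `pad_side='left'` are taken from the printed args.

```python
# Minimal sketch of how the TaskQueryHParams above could be applied.
# `FORMAT_STR` and `left_pad` are hypothetical helpers, not part of the
# dataset; the format string, pad token id (32000), and left-side padding
# mirror the printed args.

FORMAT_STR = ("SUBREDDIT: r/{subreddit}\n\n"
              "TITLE: {title}\n\n"
              "POST: {post}\n\n"
              "TL;DR:")

def left_pad(token_ids, length, pad_token=32000):
    """Pad a token-id list on the left to a fixed length (pad_side='left')."""
    if len(token_ids) > length:
        raise ValueError("sequence longer than target length")
    return [pad_token] * (length - len(token_ids)) + token_ids

query = FORMAT_STR.format(subreddit="AskDocs", title="A title", post="A post body")
padded = left_pad([11, 22, 33], length=6)  # -> [32000, 32000, 32000, 11, 22, 33]
```

Truncation (`truncate_field='post'`, `truncate_text='\n'`) would additionally shorten the post at a newline boundary until the query fits within `length=3000` tokens.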
|
Zahra99/IEMOCAP_Text | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': ang
'1': hap
'2': neu
'3': sad
splits:
- name: session1
num_bytes: 71932
num_examples: 1085
- name: session2
num_bytes: 79012
num_examples: 1023
- name: session3
num_bytes: 74980
num_examples: 1151
- name: session4
num_bytes: 72622
num_examples: 1031
- name: session5
num_bytes: 89524
num_examples: 1241
download_size: 215486
dataset_size: 388070
task_categories:
- text-classification
language:
- en
tags:
- Emotion recognition
- Text classification
size_categories:
- 1K<n<10K
---
# Dataset Card for "IEMOCAP_Text"
This dataset is derived from the IEMOCAP dataset. For more information, see the [IEMOCAP](https://sail.usc.edu/iemocap/) webpage.
It covers the most common emotion classes: angry, happy, excitement, neutral, and sad. Following common practice in this field, the excitement and happy classes are merged, leaving four labels (ang, hap, neu, sad).
The dataset contains 5531 utterances and is split by session.
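The merge described above can be sketched as a simple label mapping. This is a hypothetical illustration (`to_class_id` is not part of the dataset); the class ids follow the metadata (0: ang, 1: hap, 2: neu, 3: sad).

```python
# Hypothetical sketch of the class merge described above: the original
# IEMOCAP excitement tag ("exc") is folded into happy ("hap"), leaving
# the four class ids listed in the metadata.
LABEL_IDS = {"ang": 0, "hap": 1, "neu": 2, "sad": 3}

def to_class_id(tag: str) -> int:
    """Map an IEMOCAP emotion tag to this dataset's class id."""
    if tag == "exc":  # excitement is merged into happy
        tag = "hap"
    return LABEL_IDS[tag]
```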
|
open-llm-leaderboard/details_acrastt__OmegLLaMA-3B | ---
pretty_name: Evaluation run of acrastt/OmegLLaMA-3B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [acrastt/OmegLLaMA-3B](https://huggingface.co/acrastt/OmegLLaMA-3B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_acrastt__OmegLLaMA-3B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T23:44:46.526914](https://huggingface.co/datasets/open-llm-leaderboard/details_acrastt__OmegLLaMA-3B/blob/main/results_2023-10-25T23-44-46.526914.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0009437919463087249,\n\
\ \"em_stderr\": 0.0003144653119413412,\n \"f1\": 0.051698825503355875,\n\
\ \"f1_stderr\": 0.0012873958872857685,\n \"acc\": 0.30934559138999973,\n\
\ \"acc_stderr\": 0.007489216852203227\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0009437919463087249,\n \"em_stderr\": 0.0003144653119413412,\n\
\ \"f1\": 0.051698825503355875,\n \"f1_stderr\": 0.0012873958872857685\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \
\ \"acc_stderr\": 0.0013121578148674363\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6164167324388319,\n \"acc_stderr\": 0.013666275889539017\n\
\ }\n}\n```"
repo_url: https://huggingface.co/acrastt/OmegLLaMA-3B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|arc:challenge|25_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T23_44_46.526914
path:
- '**/details_harness|drop|3_2023-10-25T23-44-46.526914.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T23-44-46.526914.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T23_44_46.526914
path:
- '**/details_harness|gsm8k|5_2023-10-25T23-44-46.526914.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T23-44-46.526914.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hellaswag|10_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T05:16:42.253337.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T05:16:42.253337.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T05:16:42.253337.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T23_44_46.526914
path:
- '**/details_harness|winogrande|5_2023-10-25T23-44-46.526914.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T23-44-46.526914.parquet'
- config_name: results
data_files:
- split: 2023_08_26T05_16_42.253337
path:
- results_2023-08-26T05:16:42.253337.parquet
- split: 2023_10_25T23_44_46.526914
path:
- results_2023-10-25T23-44-46.526914.parquet
- split: latest
path:
- results_2023-10-25T23-44-46.526914.parquet
---
# Dataset Card for Evaluation run of acrastt/OmegLLaMA-3B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/acrastt/OmegLLaMA-3B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [acrastt/OmegLLaMA-3B](https://huggingface.co/acrastt/OmegLLaMA-3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_acrastt__OmegLLaMA-3B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-25T23:44:46.526914](https://huggingface.co/datasets/open-llm-leaderboard/details_acrastt__OmegLLaMA-3B/blob/main/results_2023-10-25T23-44-46.526914.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0009437919463087249,
"em_stderr": 0.0003144653119413412,
"f1": 0.051698825503355875,
"f1_stderr": 0.0012873958872857685,
"acc": 0.30934559138999973,
"acc_stderr": 0.007489216852203227
},
"harness|drop|3": {
"em": 0.0009437919463087249,
"em_stderr": 0.0003144653119413412,
"f1": 0.051698825503355875,
"f1_stderr": 0.0012873958872857685
},
"harness|gsm8k|5": {
"acc": 0.002274450341167551,
"acc_stderr": 0.0013121578148674363
},
"harness|winogrande|5": {
"acc": 0.6164167324388319,
"acc_stderr": 0.013666275889539017
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
rntc/biomed-fr-v2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5526536498.256718
num_examples: 13988178
- name: validation
num_bytes: 55823708.74328186
num_examples: 141295
download_size: 3607078169
dataset_size: 5582360207.0
---
# Dataset Card for "biomed-fr-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FaalSa/dbscan3 | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: float32
- name: item_id
dtype: string
- name: feat_static_cat
sequence: uint64
splits:
- name: train
num_bytes: 14358960
num_examples: 180
- name: validation
num_bytes: 14445360
num_examples: 180
- name: test
num_bytes: 14531760
num_examples: 180
download_size: 14569900
dataset_size: 43336080
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
BevenRozario/FashionProduct | ---
dataset_info:
features:
- name: Instruction
dtype: string
- name: Response
dtype: string
splits:
- name: train_dataset
num_bytes: 1559880.0
num_examples: 4500
- name: eval_dataset
num_bytes: 173320.0
num_examples: 500
download_size: 337763
dataset_size: 1733200.0
configs:
- config_name: default
data_files:
- split: train_dataset
path: data/train_dataset-*
- split: eval_dataset
path: data/eval_dataset-*
---
|
PlanTL-GOB-ES/WikiCAT_en | ---
YAML tags:
annotations_creators:
- automatically-generated
language_creators:
- found
language:
- en
license:
- cc-by-sa-3.0
multilinguality:
- monolingual
pretty_name: wikicat_en
size_categories:
- unknown
source_datasets: []
task_categories:
- text-classification
task_ids:
- multi-class-classification
---
# WikiCAT_en (Text Classification) English dataset
## Dataset Description
- **Paper:**
- **Point of Contact:**
carlos.rodriguez1@bsc.es
**Repository**
https://github.com/TeMU-BSC/WikiCAT
### Dataset Summary
WikiCAT_en is an English corpus for thematic Text Classification tasks. It is created automatically from Wikipedia and Wikidata sources, and contains 28921 article summaries from Wikipedia classified under 19 different categories.
This dataset was developed by BSC TeMU as part of the PlanTL project, and is intended as an evaluation of LT capabilities to generate useful synthetic corpora.
### Supported Tasks and Leaderboards
Text classification, Language Model
### Languages
EN - English
## Dataset Structure
### Data Instances
Two json files, one for each split.
### Data Fields
We used a simple data model with the article text and associated labels, without further metadata.
#### Example:
<pre>
{"version": "1.1.0",
"data":
[
    {'sentence': 'The IEEE Donald G. Fink Prize Paper Award was established in 1979 by the board of directors of the Institute of Electrical and Electronics Engineers (IEEE) in honor of Donald G. Fink. He was a past president of the Institute of Radio Engineers (IRE), and the first general manager and executive director of the IEEE. Recipients of this award received a certificate and an honorarium. The award was presented annually since 1981 and discontinued in 2016.', 'label': 'Engineering'
    },
.
.
.
]
}
</pre>
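As a minimal sketch of how one of these json files could be read with the Python standard library (the record below is abridged from the example above; the file layout is assumed to match the documented schema):

```python
import json

# Abridged record following the documented layout: a "version" key
# plus a "data" list of {"sentence", "label"} entries.
raw = """
{"version": "1.1.0",
 "data": [
   {"sentence": "The IEEE Donald G. Fink Prize Paper Award was established in 1979.",
    "label": "Engineering"}
 ]}
"""

corpus = json.loads(raw)
pairs = [(item["sentence"], item["label"]) for item in corpus["data"]]
print(pairs[0][1])  # Engineering
```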
#### Labels
'Health', 'Law', 'Entertainment', 'Religion', 'Business', 'Science', 'Engineering', 'Nature', 'Philosophy', 'Economy', 'Sports', 'Technology', 'Government', 'Mathematics', 'Military', 'Humanities', 'Music', 'Politics', 'History'
### Data Splits
* hftrain_en.json: 20237 label-document pairs
* hfeval_en.json: 8684 label-document pairs
## Dataset Creation
### Methodology
Starting "Category:" pages are chosen to represent the topics in each language.
For each category, the main pages are extracted, as well as the subcategories and the individual pages under these first-level subcategories.
For each page, the "summary" provided by Wikipedia is also extracted.
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
The source data are Wikipedia page summaries and thematic categories
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
Automatic annotation
### Personal and Sensitive Information
No personal or sensitive information included.
## Considerations for Using the Data
### Social Impact of Dataset
[N/A]
### Discussion of Biases
[N/A]
### Other Known Limitations
[N/A]
## Additional Information
### Dataset Curators
Text Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@bsc.es).
For further information, send an email to (plantl-gob-es@bsc.es).
This work was funded by the [Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA)](https://avancedigital.mineco.gob.es/en-us/Paginas/index.aspx) within the framework of the [Plan-TL](https://plantl.mineco.gob.es/Paginas/index.aspx).
### Licensing information
This work is licensed under [CC Attribution 4.0 International](https://creativecommons.org/licenses/by/4.0/) License.
Copyright by the Spanish State Secretariat for Digitalization and Artificial Intelligence (SEDIA) (2022)
### Contributions
[N/A] |
youlun77/2000_TextClassification | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 147675.6
num_examples: 1800
- name: test
num_bytes: 16408.4
num_examples: 200
download_size: 74511
dataset_size: 164084.0
---
# Dataset Card for "2000_TextClassification"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
allenai/ValuePrism | ---
configs:
- config_name: full
data_files: full/*csv
default: true
- config_name: mixture
data_files:
- split: train
path: mixture/*train.csv
- split: val
path: mixture/*val.csv
- split: test
path: mixture/*test.csv
- config_name: generative
data_files:
- split: train
path: generative/*train.csv
- split: val
path: generative/*val.csv
- split: test
path: generative/*test.csv
- config_name: relevance
data_files:
- split: train
path: relevance/*train.csv
- split: val
path: relevance/*val.csv
- split: test
path: relevance/*test.csv
- config_name: explanation
data_files:
- split: train
path: explanation/*train.csv
- split: val
path: explanation/*val.csv
- split: test
path: explanation/*test.csv
- config_name: valence
data_files:
- split: train
path: valence/*train.csv
- split: val
path: valence/*val.csv
- split: test
path: valence/*test.csv
annotations_creators:
- crowdsourced: null
machine-generated: null
language:
- en
pretty_name: ValuePrism
extra_gated_prompt: >-
Access to this dataset is automatically granted upon accepting the [**AI2
ImpACT License - Medium Risk Artifacts (“MR
Agreement”)**](https://allenai.org/licenses/impact-mr) and completing all
fields below.
extra_gated_fields:
Your full name: text
Organization or entity you are affiliated with: text
State or country you are located in: text
Contact email: text
Please describe your intended use of the medium risk artifact(s): text
I UNDERSTAND that the dataset is intended for research purposes and not for real-world use-cases: checkbox
I AGREE to the terms and conditions of the MR Agreement above: checkbox
I AGREE to AI2’s use of my information for legal notices and administrative matters: checkbox
I CERTIFY that the information I have provided is true and accurate: checkbox
tags:
- not-for-all-audiences
size_categories:
- 100K<n<1M
---
# Dataset Card for ValuePrism
## Dataset Description
- **Paper:** https://arxiv.org/abs/2309.00779
- **Demo:** https://kaleido.allen.ai
- **Repository:** https://github.com/tsor13/kaleido
- **Datasheet for Datasets:** https://drive.google.com/file/d/1zDWvO0NljqxBMfDAGW7Jx60Iw54bjsEE/view?usp=sharing
- **License:** https://allenai.org/licenses/impact-mr
- **Point of Contact:** [Taylor Sorensen](mailto:tsor13@cs.washington.edu)
### Dataset Summary
ValuePrism was created 1) to understand what pluralistic human values, rights, and duties are already present in large language models, and 2) to serve as a resource to support open, value pluralistic modeling (e.g., [Kaleido](https://huggingface.co/tsor13/kaleido-xl)). It contains human-written situations and machine-generated candidate values, rights, and duties, along with their valences and post-hoc explanations relating them to the situations.
For additional documentation, see ValuePrism's [Datasheet](https://drive.google.com/file/d/1zDWvO0NljqxBMfDAGW7Jx60Iw54bjsEE/view?usp=sharing).
The dataset was created and intended for research purposes. It is openly released under AI2’s ImpACT license as a medium risk artifact.
### Supported Tasks
The dataset supports 4 tasks:
- **Generation (open-text)**
*What values, rights, and duties are relevant for a situation?*
Generate a value, right, or duty
that could be considered when reasoning about the action. Values are generated one at a time, as opposed to a batch.
- **Relevance (2-way classification)**
*Is a value relevant for a situation?* Some values are more relevant than others.
- **Valence (3-way classification)**
*Does the value support or oppose the action, or might it depend on context?*
Disentangling the valence is critical for understanding how plural considerations may interact with a decision.
- **Explanation (open-text)**
*How does the value relate to the action?* Generating a post-hoc rationale for why a value consideration may relate to a situation.
### Languages
All data is in English.
## Dataset Structure
### Data Configurations
There are 6 data configurations:
- `full`: The full structured dataset of situations paired with values, rights, and duties paired with GPT-4. Only one split with all of the data.
- `generative`: Generative task train, val, and test splits.
- `relevance`: Relevance task train, val, and test splits.
- `valence`: Valence task train, val, and test splits.
- `explanation`: Explanation task train, val, and test splits.
- `mixture`: Generative, relevance, valence, and explanation tasks combined, with train, val, and test splits.
### Data Fields
While different configurations have different fields, these are all the corresponding fields in the dataset:
- `situation` (string): A one sentence of a particular scenario or situation. For example, "buying some chocolate for my grandparents".
- `vrd` (string): Type of instance, either "Value", "Right", or "Duty".
- `text` (string): The text of the value, right, or duty. For example, "Honesty", "Right to property", "Duty to protect".
- `explanation` (string): A post-hoc explanation of why the specified value, right, or duty is relevant or important in the given situation. For example, "Buying chocolate for your grandparents can strengthen family connections and show appreciation for your relationship with them."
- `valence` (string): Indicates whether the value, right, or duty supports or opposes the action in the situation, or if it might depend on the context. Either "Supports", "Opposes", or "Either".
- `input` (string): For the seq2seq task (generative, relevance, valence, explanation), the input to the model.
- `output` (string): For the seq2seq task (generative, relevance, valence, explanation), the output of the model.
### Data Splits
All configurations (except for the raw outputs in `full`) have 80%/10%/10% train/validation/test splits.
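To make the ratio concrete, here is a small deterministic sketch of an 80%/10%/10% partition (illustrative only; the actual splits ship precomputed with each configuration):

```python
def split_80_10_10(n_examples: int):
    """Return (train, val, test) index ranges for an 80/10/10 partition."""
    n_train = int(n_examples * 0.8)
    n_val = int(n_examples * 0.1)
    train = range(0, n_train)
    val = range(n_train, n_train + n_val)
    test = range(n_train + n_val, n_examples)  # remainder goes to test
    return train, val, test

train, val, test = split_80_10_10(1000)
print(len(train), len(val), len(test))  # 800 100 100
```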
## Dataset Creation
### Source Data
#### Data Collection
Situations are sourced from the Delphi user demo, and candidate values, rights, and duties, their valences, and explanations connecting them to the situations are machine-generated by GPT-4.
#### Who are the source language producers?
The situations are sourced from users of the Delphi user demo, for whom we do not have demographic information.
### Personal and Sensitive Information
There is no personal or sensitive information in ValuePrism.
## Considerations for Using the Data
### Social Impact of Dataset
We intend the dataset to be used to enable research and not to be used for real-world use or decision-making.
### Discussion of Biases
The value, right, and duty data was generated by GPT-4, which is known to exhibit [biases](https://arxiv.org/pdf/2304.03738.pdf). Thus, we expect ValuePrism to inherit biases from GPT-4.
That being said, we have tried to prompt the model to output a diversity of values in an attempt to mitigate bias with breadth.
## Additional Information
91% of values, rights, and duties were marked as high-quality by 3/3 annotators, and 87% of valence scores were marked as correct by 3/3 annotators.
Additionally, we perform a human study on the data and do not find large disparities in agreement between demographic groups tested, although future work in this area is a promising direction.
See [our paper](https://arxiv.org/abs/2309.00779) for more details and analysis.
### Licensing Information
ValuePrism is made available under the [**AI2
ImpACT License - Medium Risk Artifacts (“MR
Agreement”)**](https://allenai.org/licenses/impact-mr)
### Citation Information
Please cite [our paper](https://arxiv.org/abs/2309.00779) when using this dataset:
```
@misc{sorensen2023value,
title={Value Kaleidoscope: Engaging AI with Pluralistic Human Values, Rights, and Duties},
author={Taylor Sorensen and Liwei Jiang and Jena Hwang and Sydney Levine and Valentina Pyatkin and Peter West and Nouha Dziri and Ximing Lu and Kavel Rao and Chandra Bhagavatula and Maarten Sap and John Tasioulas and Yejin Choi},
year={2023},
eprint={2309.00779},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
#### Raw Dataset Statistics
The total, number of unique, and average number of generated values, rights, and duties per situation are shown.
| **Type** | **Total** | **Unique** | **Per Situation** |
|--------------|-----------|------------|--------------------|
| **Situations** | 31.0k | 31.0k | 1 |
| **Values** | 97.7k | 4.2k | 3.15 |
| **Rights** | 49.0k | 4.6k | 1.58 |
| **Duties** | 71.6k | 12.8k | 2.31 |
#### Task Dataset Statistics
| | **Relevance** | **Valence** | **Generation** | **Explanation** | **Mixture** |
|---------------|------------|-------------|----------|-----------|-------------|
| **Train** | 349k | 175k | 175k | 175k | 874k |
| **Val** | 44k | 22k | 22k | 22k | 109k |
| **Test** | 44k | 22k | 22k | 22k | 109k |
| **Total** | 437k | 219k | 219k | 219k | 1.1M | |
Multimodal-Fatima/Caltech101_with_background_test_facebook_opt_350m_Attributes_Caption_ns_6084_random | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_1_bs_16
num_bytes: 102753751.5
num_examples: 6084
- name: fewshot_3_bs_16
num_bytes: 105997404.5
num_examples: 6084
download_size: 201172569
dataset_size: 208751156.0
---
# Dataset Card for "Caltech101_with_background_test_facebook_opt_350m_Attributes_Caption_ns_6084_random"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Technoculture__PMCorca-2x13b | ---
pretty_name: Evaluation run of Technoculture/PMCorca-2x13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Technoculture/PMCorca-2x13b](https://huggingface.co/Technoculture/PMCorca-2x13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Technoculture__PMCorca-2x13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T00:38:33.372199](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__PMCorca-2x13b/blob/main/results_2024-02-10T00-38-33.372199.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.232157473965765,\n\
\ \"acc_stderr\": 0.029934682640696163,\n \"acc_norm\": 0.23234243360853526,\n\
\ \"acc_norm_stderr\": 0.03072505746202978,\n \"mc1\": 0.24112607099143207,\n\
\ \"mc1_stderr\": 0.01497482727975234,\n \"mc2\": 0.49715720852516543,\n\
\ \"mc2_stderr\": 0.017018403903011948\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.22440273037542663,\n \"acc_stderr\": 0.012191404938603843,\n\
\ \"acc_norm\": 0.2721843003412969,\n \"acc_norm_stderr\": 0.013006600406423707\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25941047600079664,\n\
\ \"acc_stderr\": 0.004374153847826759,\n \"acc_norm\": 0.25941047600079664,\n\
\ \"acc_norm_stderr\": 0.004374153847826759\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.24112607099143207,\n \"mc1_stderr\": 0.01497482727975234,\n\
\ \"mc2\": 0.49715720852516543,\n \"mc2_stderr\": 0.017018403903011948\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.5011838989739542,\n\
\ \"acc_stderr\": 0.014052446290529019\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Technoculture/PMCorca-2x13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|arc:challenge|25_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|gsm8k|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hellaswag|10_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T00-38-33.372199.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T00-38-33.372199.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- '**/details_harness|winogrande|5_2024-02-10T00-38-33.372199.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T00-38-33.372199.parquet'
- config_name: results
data_files:
- split: 2024_02_10T00_38_33.372199
path:
- results_2024-02-10T00-38-33.372199.parquet
- split: latest
path:
- results_2024-02-10T00-38-33.372199.parquet
---
# Dataset Card for Evaluation run of Technoculture/PMCorca-2x13b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Technoculture/PMCorca-2x13b](https://huggingface.co/Technoculture/PMCorca-2x13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Technoculture__PMCorca-2x13b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-10T00:38:33.372199](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__PMCorca-2x13b/blob/main/results_2024-02-10T00-38-33.372199.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.232157473965765,
"acc_stderr": 0.029934682640696163,
"acc_norm": 0.23234243360853526,
"acc_norm_stderr": 0.03072505746202978,
"mc1": 0.24112607099143207,
"mc1_stderr": 0.01497482727975234,
"mc2": 0.49715720852516543,
"mc2_stderr": 0.017018403903011948
},
"harness|arc:challenge|25": {
"acc": 0.22440273037542663,
"acc_stderr": 0.012191404938603843,
"acc_norm": 0.2721843003412969,
"acc_norm_stderr": 0.013006600406423707
},
"harness|hellaswag|10": {
"acc": 0.25941047600079664,
"acc_stderr": 0.004374153847826759,
"acc_norm": 0.25941047600079664,
"acc_norm_stderr": 0.004374153847826759
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24112607099143207,
"mc1_stderr": 0.01497482727975234,
"mc2": 0.49715720852516543,
"mc2_stderr": 0.017018403903011948
},
"harness|winogrande|5": {
"acc": 0.5011838989739542,
"acc_stderr": 0.014052446290529019
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
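As an illustrative sketch (not part of the official leaderboard tooling), the per-task accuracies in a results dictionary like the one above can be aggregated into a single MMLU average; the `results` dict below is a hypothetical subset of the full JSON:

```python
# Hypothetical subset of a results dict in the format shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.22},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.18518518518518517},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.17763157894736842},
}

# Select the MMLU (hendrycksTest) subtasks and average their accuracies.
mmlu_tasks = {k: v for k, v in results.items() if "hendrycksTest" in k}
mmlu_acc = sum(v["acc"] for v in mmlu_tasks.values()) / len(mmlu_tasks)
print(f"MMLU average acc over {len(mmlu_tasks)} tasks: {mmlu_acc:.4f}")
```

The full results file contains all 57 MMLU subtasks; the same filter-and-average pattern applies.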
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/ronye_arabel_swordartonline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ronye_arabel (Sword Art Online)
This is the dataset of ronye_arabel (Sword Art Online), containing 38 images and their tags.
Images are crawled from many sites (e.g., danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
|
cestwc/FLD_1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: hypothesis
dtype: string
- name: context
dtype: string
- name: hypothesis_formula
dtype: string
- name: context_formula
dtype: string
- name: proofs
sequence: string
- name: proof_label
dtype: string
- name: proofs_formula
sequence: string
- name: world_assump_label
dtype: string
- name: original_tree_depth
dtype: int64
- name: depth
dtype: int64
- name: num_formula_distractors
dtype: int64
- name: num_translation_distractors
dtype: int64
- name: num_all_distractors
dtype: int64
- name: negative_hypothesis
dtype: string
- name: negative_hypothesis_formula
dtype: string
- name: negative_original_tree_depth
dtype: int64
- name: negative_proofs
sequence: string
- name: negative_proof_label
dtype: string
- name: negative_world_assump_label
dtype: string
- name: prompt_serial
dtype: string
- name: proof_serial
dtype: string
- name: version
dtype: string
- name: premise
dtype: string
- name: assumptions
sequence: string
- name: paraphrased_premises
sequence: string
splits:
- name: train
num_bytes: 111376663
num_examples: 30000
- name: validation
num_bytes: 18529236
num_examples: 5000
- name: test
num_bytes: 18529220
num_examples: 5000
download_size: 56252409
dataset_size: 148435119
---
# Dataset Card for "FLD_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RobertLau/decoder_json | ---
license: openrail
---
|
distilled-from-one-sec-cv12/chunk_74 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1284303528
num_examples: 250254
download_size: 1310863681
dataset_size: 1284303528
---
# Dataset Card for "chunk_74"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/sst2_affix | ---
dataset_info:
features:
- name: idx
dtype: int32
- name: sentence
dtype: string
- name: label
dtype:
class_label:
names:
'0': negative
'1': positive
- name: words_with_affixes
sequence: string
splits:
- name: validation
num_bytes: 22640
num_examples: 146
download_size: 19044
dataset_size: 22640
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "sst2_affix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MicPie/unpredictable_cluster27 | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: UnpredicTable-cluster27
size_categories:
- 100K<n<1M
source_datasets: []
task_categories:
- multiple-choice
- question-answering
- zero-shot-classification
- text2text-generation
- table-question-answering
- text-generation
- text-classification
- tabular-classification
task_ids:
- multiple-choice-qa
- extractive-qa
- open-domain-qa
- closed-domain-qa
- closed-book-qa
- open-book-qa
- language-modeling
- multi-class-classification
- natural-language-inference
- topic-classification
- multi-label-classification
- tabular-multi-class-classification
- tabular-multi-label-classification
---
# Dataset Card for "UnpredicTable-cluster27" - Dataset of Few-shot Tasks from Tables
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://ethanperez.net/unpredictable
- **Repository:** https://github.com/JunShern/few-shot-adaptation
- **Paper:** Few-shot Adaptation Works with UnpredicTable Data
- **Point of Contact:** junshern@nyu.edu, perez@nyu.edu
### Dataset Summary
The UnpredicTable dataset consists of web tables formatted as few-shot tasks for fine-tuning language models to improve their few-shot performance.
There are several dataset versions available:
* [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full): Starting from the initial WTC corpus of 50M tables, we apply our tables-to-tasks procedure to produce our resulting dataset, [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full), which comprises 413,299 tasks from 23,744 unique websites.
* [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique): This is the same as [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full) but filtered to have a maximum of one task per website. [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique) contains exactly 23,744 tasks from 23,744 websites.
* [UnpredicTable-5k](https://huggingface.co/datasets/MicPie/unpredictable_5k): This dataset contains 5k random tables from the full dataset.
* UnpredicTable data subsets based on a manual human quality rating (please see our publication for details of the ratings):
* [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low)
* [UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium)
* [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high)
* UnpredicTable data subsets based on the website of origin:
* [UnpredicTable-baseball-fantasysports-yahoo-com](https://huggingface.co/datasets/MicPie/unpredictable_baseball-fantasysports-yahoo-com)
* [UnpredicTable-bulbapedia-bulbagarden-net](https://huggingface.co/datasets/MicPie/unpredictable_bulbapedia-bulbagarden-net)
* [UnpredicTable-cappex-com](https://huggingface.co/datasets/MicPie/unpredictable_cappex-com)
* [UnpredicTable-cram-com](https://huggingface.co/datasets/MicPie/unpredictable_cram-com)
* [UnpredicTable-dividend-com](https://huggingface.co/datasets/MicPie/unpredictable_dividend-com)
* [UnpredicTable-dummies-com](https://huggingface.co/datasets/MicPie/unpredictable_dummies-com)
* [UnpredicTable-en-wikipedia-org](https://huggingface.co/datasets/MicPie/unpredictable_en-wikipedia-org)
* [UnpredicTable-ensembl-org](https://huggingface.co/datasets/MicPie/unpredictable_ensembl-org)
* [UnpredicTable-gamefaqs-com](https://huggingface.co/datasets/MicPie/unpredictable_gamefaqs-com)
* [UnpredicTable-mgoblog-com](https://huggingface.co/datasets/MicPie/unpredictable_mgoblog-com)
* [UnpredicTable-mmo-champion-com](https://huggingface.co/datasets/MicPie/unpredictable_mmo-champion-com)
* [UnpredicTable-msdn-microsoft-com](https://huggingface.co/datasets/MicPie/unpredictable_msdn-microsoft-com)
* [UnpredicTable-phonearena-com](https://huggingface.co/datasets/MicPie/unpredictable_phonearena-com)
* [UnpredicTable-sittercity-com](https://huggingface.co/datasets/MicPie/unpredictable_sittercity-com)
* [UnpredicTable-sporcle-com](https://huggingface.co/datasets/MicPie/unpredictable_sporcle-com)
* [UnpredicTable-studystack-com](https://huggingface.co/datasets/MicPie/unpredictable_studystack-com)
* [UnpredicTable-support-google-com](https://huggingface.co/datasets/MicPie/unpredictable_support-google-com)
* [UnpredicTable-w3-org](https://huggingface.co/datasets/MicPie/unpredictable_w3-org)
* [UnpredicTable-wiki-openmoko-org](https://huggingface.co/datasets/MicPie/unpredictable_wiki-openmoko-org)
* [UnpredicTable-wkdu-org](https://huggingface.co/datasets/MicPie/unpredictable_wkdu-org)
* UnpredicTable data subsets based on clustering (for the clustering details please see our publication):
* [UnpredicTable-cluster00](https://huggingface.co/datasets/MicPie/unpredictable_cluster00)
* [UnpredicTable-cluster01](https://huggingface.co/datasets/MicPie/unpredictable_cluster01)
* [UnpredicTable-cluster02](https://huggingface.co/datasets/MicPie/unpredictable_cluster02)
* [UnpredicTable-cluster03](https://huggingface.co/datasets/MicPie/unpredictable_cluster03)
* [UnpredicTable-cluster04](https://huggingface.co/datasets/MicPie/unpredictable_cluster04)
* [UnpredicTable-cluster05](https://huggingface.co/datasets/MicPie/unpredictable_cluster05)
* [UnpredicTable-cluster06](https://huggingface.co/datasets/MicPie/unpredictable_cluster06)
* [UnpredicTable-cluster07](https://huggingface.co/datasets/MicPie/unpredictable_cluster07)
* [UnpredicTable-cluster08](https://huggingface.co/datasets/MicPie/unpredictable_cluster08)
* [UnpredicTable-cluster09](https://huggingface.co/datasets/MicPie/unpredictable_cluster09)
* [UnpredicTable-cluster10](https://huggingface.co/datasets/MicPie/unpredictable_cluster10)
* [UnpredicTable-cluster11](https://huggingface.co/datasets/MicPie/unpredictable_cluster11)
* [UnpredicTable-cluster12](https://huggingface.co/datasets/MicPie/unpredictable_cluster12)
* [UnpredicTable-cluster13](https://huggingface.co/datasets/MicPie/unpredictable_cluster13)
* [UnpredicTable-cluster14](https://huggingface.co/datasets/MicPie/unpredictable_cluster14)
* [UnpredicTable-cluster15](https://huggingface.co/datasets/MicPie/unpredictable_cluster15)
* [UnpredicTable-cluster16](https://huggingface.co/datasets/MicPie/unpredictable_cluster16)
* [UnpredicTable-cluster17](https://huggingface.co/datasets/MicPie/unpredictable_cluster17)
* [UnpredicTable-cluster18](https://huggingface.co/datasets/MicPie/unpredictable_cluster18)
* [UnpredicTable-cluster19](https://huggingface.co/datasets/MicPie/unpredictable_cluster19)
* [UnpredicTable-cluster20](https://huggingface.co/datasets/MicPie/unpredictable_cluster20)
* [UnpredicTable-cluster21](https://huggingface.co/datasets/MicPie/unpredictable_cluster21)
* [UnpredicTable-cluster22](https://huggingface.co/datasets/MicPie/unpredictable_cluster22)
* [UnpredicTable-cluster23](https://huggingface.co/datasets/MicPie/unpredictable_cluster23)
* [UnpredicTable-cluster24](https://huggingface.co/datasets/MicPie/unpredictable_cluster24)
* [UnpredicTable-cluster25](https://huggingface.co/datasets/MicPie/unpredictable_cluster25)
* [UnpredicTable-cluster26](https://huggingface.co/datasets/MicPie/unpredictable_cluster26)
* [UnpredicTable-cluster27](https://huggingface.co/datasets/MicPie/unpredictable_cluster27)
* [UnpredicTable-cluster28](https://huggingface.co/datasets/MicPie/unpredictable_cluster28)
* [UnpredicTable-cluster29](https://huggingface.co/datasets/MicPie/unpredictable_cluster29)
* [UnpredicTable-cluster-noise](https://huggingface.co/datasets/MicPie/unpredictable_cluster-noise)
### Supported Tasks and Leaderboards
Since the tables come from the web, the distribution of tasks and topics is very broad. The shape of our dataset is very wide, i.e., we have 1000s of tasks, while each task has only a few examples, compared to most current NLP datasets, which are very deep, i.e., 10s of tasks with many examples. This implies that our dataset covers a broad range of potential tasks, e.g., multiple-choice, question-answering, table-question-answering, text-classification, etc.
The intended use of this dataset is to improve few-shot performance by fine-tuning/pre-training on our dataset.
### Languages
English
## Dataset Structure
### Data Instances
Each task is represented as a jsonline file and consists of several few-shot examples. Each example is a dictionary containing a field 'task', which identifies the task, followed by an 'input', 'options', and 'output' field. The 'input' field contains several column elements of the same row in the table, while the 'output' field is a target which represents an individual column of the same row. Each task contains several such examples which can be concatenated as a few-shot task. In the case of multiple choice classification, the 'options' field contains the possible classes that a model needs to choose from.
There are also additional meta-data fields such as 'pageTitle', 'title', 'outputColName', 'url', 'wdcFile'.
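The records described above can be concatenated into a few-shot prompt. A minimal sketch (all field values below are invented for illustration and are not taken from the dataset):

```python
# Hypothetical UnpredicTable-style records; each dict mirrors the
# 'task'/'input'/'options'/'output' fields described above.
examples = [
    {"task": "country_capital", "input": "Country: France", "options": [], "output": "Paris"},
    {"task": "country_capital", "input": "Country: Japan", "options": [], "output": "Tokyo"},
    {"task": "country_capital", "input": "Country: Kenya", "options": [], "output": "Nairobi"},
]

# Use all but the last example as in-context demonstrations and leave
# the last input as the query for the model to complete.
demos = "\n".join(f"{ex['input']} -> {ex['output']}" for ex in examples[:-1])
prompt = f"{demos}\n{examples[-1]['input']} ->"
print(prompt)
```

In practice each task's jsonline file supplies the examples, and for multiple-choice tasks the `options` field would additionally be rendered into the prompt.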
### Data Fields
- `task`: task identifier
- `input`: column elements of a specific row in the table
- `options`: for multiple-choice classification, the options to choose from
- `output`: target column element of the same row as the input
- `pageTitle`: the title of the page containing the table
- `title`: the title of the table
- `outputColName`: name of the output column
- `url`: URL of the website containing the table
- `wdcFile`: WDC Web Table Corpus file
### Data Splits
The UnpredicTable datasets do not come with additional data splits.
## Dataset Creation
### Curation Rationale
Few-shot training on multi-task datasets has been demonstrated to improve language models' few-shot learning (FSL) performance on new tasks, but it is unclear which training tasks lead to effective downstream task adaptation. Few-shot learning datasets are typically produced with expensive human curation, limiting the scale and diversity of the training tasks available to study. As an alternative source of few-shot data, we automatically extract 413,299 tasks from diverse internet tables. We provide this as a research resource to investigate the relationship between training data and few-shot learning.
### Source Data
#### Initial Data Collection and Normalization
We use internet tables from the English-language Relational Subset of the WDC Web Table Corpus 2015 (WTC). The WTC dataset tables were extracted from the July 2015 Common Crawl web corpus (http://webdatacommons.org/webtables/2015/EnglishStatistics.html). The dataset contains 50,820,165 tables from 323,160 web domains. We then convert the tables into few-shot learning tasks. Please see our publication for more details on the data collection and conversion pipeline.
#### Who are the source language producers?
The dataset is extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/).
### Annotations
#### Annotation process
Manual annotation was only carried out for the [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low),
[UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium), and [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high) data subsets to rate task quality. Detailed annotation instructions can be found in our publication.
#### Who are the annotators?
Annotations were carried out by a lab assistant.
### Personal and Sensitive Information
The data was extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/), which in turn extracted tables from the [Common Crawl](https://commoncrawl.org/). We did not filter the data in any way. Thus any user identities or otherwise sensitive information (e.g., data that reveals racial or ethnic origins, sexual orientations, religious beliefs, political opinions or union memberships, or locations; financial or health data; biometric or genetic data; forms of government identification, such as social security numbers; criminal history, etc.) might be contained in our dataset.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is intended for use as a research resource to investigate the relationship between training data and few-shot learning. As such, it contains high- and low-quality data, as well as diverse content that may be untruthful or inappropriate. Without careful investigation, it should not be used for training models that will be deployed for use in decision-critical or user-facing situations.
### Discussion of Biases
Since our dataset contains tables that are scraped from the web, it will also contain many toxic, racist, sexist, and otherwise harmful biases and texts. We have not run any analysis on the biases prevalent in our datasets. Neither have we explicitly filtered the content. This implies that a model trained on our dataset may potentially reflect harmful biases and toxic text that exist in our dataset.
### Other Known Limitations
No additional known limitations.
## Additional Information
### Dataset Curators
Jun Shern Chan, Michael Pieler, Jonathan Jao, Jérémy Scheurer, Ethan Perez
### Licensing Information
Apache 2.0
### Citation Information
```
@misc{chan2022few,
author = {Chan, Jun Shern and Pieler, Michael and Jao, Jonathan and Scheurer, Jérémy and Perez, Ethan},
title = {Few-shot Adaptation Works with UnpredicTable Data},
publisher={arXiv},
year = {2022},
url = {https://arxiv.org/abs/2208.01009}
}
```
|
autoevaluate/autoeval-staging-eval-emotion-default-73111d-17386348 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- emotion
eval_info:
task: multi_class_classification
model: lewtun/sagemaker-distilbert-emotion
metrics: []
dataset_name: emotion
dataset_config: default
dataset_split: test
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: lewtun/sagemaker-distilbert-emotion
* Dataset: emotion
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
kowndinya23/Kvasir-SEG | ---
dataset_info:
features:
- name: name
dtype: string
- name: image
dtype: image
- name: annotation
dtype: image
splits:
- name: train
num_bytes: 36829616.0
num_examples: 880
- name: validation
num_bytes: 8018441.0
num_examples: 120
download_size: 44672597
dataset_size: 44848057.0
---
# Dataset Card for "Kvasir-SEG"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kiriyamaX/mini-bench | ---
license: gpl-3.0
---
|
allenai/ms2_sparse_oracle | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- extended|other-MS^2
- extended|other-Cochrane
task_categories:
- summarization
- text2text-generation
paperswithcode_id: multi-document-summarization
pretty_name: MSLR Shared Task
---
This is a copy of the [MS^2](https://huggingface.co/datasets/allenai/mslr2022) dataset, except that the input source documents of its `validation` split have been replaced by documents retrieved with a __sparse__ retriever. The retrieval pipeline used:
- __query__: The `background` field of each example
- __corpus__: The union of all documents in the `train`, `validation` and `test` splits. A document is the concatenation of the `title` and `abstract`.
- __retriever__: BM25 via [PyTerrier](https://pyterrier.readthedocs.io/en/latest/) with default settings
- __top-k strategy__: `"oracle"`, i.e. the number of documents retrieved, `k`, is set as the original number of input documents for each example
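The pipeline above can be sketched with a toy, pure-Python BM25 scorer (used here in place of PyTerrier; the corpus, query, and `k` value are invented for illustration):

```python
import math
from collections import Counter

def bm25_scores(query, docs, k1=1.2, b=0.75):
    """Score every document in `docs` against `query` with standard BM25."""
    tokenized = [d.lower().split() for d in docs]
    n = len(tokenized)
    avgdl = sum(len(d) for d in tokenized) / n
    df = Counter(t for d in tokenized for t in set(d))  # document frequency per term
    scores = []
    for d in tokenized:
        tf = Counter(d)
        score = 0.0
        for t in query.lower().split():
            if t not in tf:
                continue
            idf = math.log(1 + (n - df[t] + 0.5) / (df[t] + 0.5))
            score += idf * tf[t] * (k1 + 1) / (tf[t] + k1 * (1 - b + b * len(d) / avgdl))
        scores.append(score)
    return scores

# Toy corpus: each entry stands in for a concatenated title + abstract.
corpus = [
    "aspirin reduces cardiovascular risk",
    "vitamin d supplementation in older adults",
    "effects of aspirin on stroke prevention",
]
background = "aspirin for prevention of cardiovascular events"

# "Oracle" top-k: k is set to the example's original number of input documents.
k = 2
scores = bm25_scores(background, corpus)
top_k = sorted(range(len(corpus)), key=lambda i: -scores[i])[:k]
print(top_k)
```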
Retrieval results on the `train` set:
| Recall@100 | Rprec | Precision@k | Recall@k |
| ----------- | ----------- | ----------- | ----------- |
| 0.4333 | 0.2163 | 0.2163 | 0.2163 |
Retrieval results on the `validation` set:
| Recall@100 | Rprec | Precision@k | Recall@k |
| ----------- | ----------- | ----------- | ----------- |
| 0.3780 | 0.1827 | 0.1827 | 0.1827 |
Retrieval results on the `test` set:
| Recall@100 | Rprec | Precision@k | Recall@k |
| ----------- | ----------- | ----------- | ----------- |
| 0.3928 | 0.1898 | 0.1898 | 0.1898 | |
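Note that Rprec, Precision@k, and Recall@k coincide in every row above; this is expected under the oracle strategy, since `k` equals the number of relevant (original input) documents for each example. A toy sketch of the computation (document ids invented):

```python
def oracle_metrics(ranked, relevant):
    """Rprec, Precision@k and Recall@k with k = |relevant| (the oracle setting)."""
    k = len(relevant)
    hits = len(set(ranked[:k]) & set(relevant))
    rprec = hits / len(relevant)       # hits within the top-|relevant| results
    precision_at_k = hits / k          # identical denominator when k = |relevant|
    recall_at_k = hits / len(relevant)
    return rprec, precision_at_k, recall_at_k

ranked = ["d3", "d1", "d7", "d2"]   # retriever output, best first
relevant = ["d1", "d2"]             # the example's original input documents
print(oracle_metrics(ranked, relevant))  # (0.5, 0.5, 0.5) — all three agree
```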
CyberHarem/prinz_eugen_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of prinz_eugen/プリンツ・オイゲン/欧根亲王 (Azur Lane)
This is the dataset of prinz_eugen/プリンツ・オイゲン/欧根亲王 (Azur Lane), containing 500 images and their tags.
The core tags of this character are `long_hair, multicolored_hair, streaked_hair, breasts, red_hair, bangs, two_side_up, large_breasts, white_hair, very_long_hair, antenna_hair, mole, mole_on_breast, hair_between_eyes, headgear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 943.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/prinz_eugen_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 455.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/prinz_eugen_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1369 | 1.05 GiB | [Download](https://huggingface.co/datasets/CyberHarem/prinz_eugen_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 798.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/prinz_eugen_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1369 | 1.61 GiB | [Download](https://huggingface.co/datasets/CyberHarem/prinz_eugen_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/prinz_eugen_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, bare_shoulders, black_dress, looking_at_viewer, solo, black_gloves, blush, brown_eyes, hair_bow, simple_background, thighs, white_background, barefoot, sitting |
| 1 | 6 |  |  |  |  |  | 1girl, bare_shoulders, black_gloves, cleavage, looking_at_viewer, maid_headdress, official_alternate_costume, solo, black_thighhighs, blush, elbow_gloves, black_dress, navel, smile, cross, red_eyes, thigh_strap |
| 2 | 6 |  |  |  |  |  | 1girl, black_gloves, blush, garter_straps, iron_cross, long_sleeves, looking_at_viewer, sideboob, solo, white_background, machinery, rigging, simple_background, swept_bangs, thighs, finger_to_mouth, grey_hair, ass, black_thighhighs, brown_eyes, orange_eyes |
| 3 | 14 |  |  |  |  |  | 1girl, iron_cross, long_sleeves, looking_at_viewer, solo, black_gloves, sideboob, simple_background, white_background, garter_straps, grey_hair, orange_eyes, smile, closed_mouth, rigging, sitting, black_thighhighs |
| 4 | 9 |  |  |  |  |  | 1girl, black_thighhighs, blush, iron_cross, kimono, looking_at_viewer, solo, wide_sleeves, black_panties, sideboob, thighs, long_sleeves, sakazuki, choker, holding, obi, open_mouth, side-tie_panties, smile, alcohol, simple_background, swept_bangs, white_background, collarbone, nail_polish, pelvic_curtain |
| 5 | 6 |  |  |  |  |  | 1girl, bare_shoulders, blush, bridal_gauntlets, bridal_veil, cleavage, looking_at_viewer, see-through, smile, solo, wedding_dress, white_dress, white_gloves, flower, grey_hair, official_alternate_costume, sitting, thighs, white_thighhighs, brown_eyes, closed_mouth, finger_to_mouth, garter_straps, petals |
| 6 | 10 |  |  |  |  |  | 1girl, cropped_jacket, cropped_shirt, looking_at_viewer, navel, official_alternate_costume, open_jacket, race_queen, red_panties, solo, yellow_eyes, black_gloves, half_gloves, thighhighs, two-tone_skirt, panty_straps, miniskirt, purple_jacket, underboob_cutout, white_belt, white_skirt, checkered_flag, earpiece, black_jacket, twintails, blue_sky, highleg, holding_flag, simple_background, sitting, smile, standing, stomach, thigh_boots, two-tone_hair, white_background |
| 7 | 5 |  |  |  |  |  | 1girl, blush, looking_at_viewer, nurse_cap, short_sleeves, solo, cross, open_mouth, simple_background, white_background, white_dress, white_gloves, white_thighhighs, sideboob, cleavage, holding_syringe, orange_eyes, smile, swept_bangs, thighs, white_headwear |
| 8 | 73 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blush, hair_ribbon, black_bikini, black_ribbon, cleavage, navel, side-tie_bikini_bottom, black_choker, german_flag_bikini, collarbone, bare_shoulders, thigh_strap, official_alternate_costume, smile, brown_eyes, grey_hair, thighs, day, outdoors, cross, earrings, simple_background, tongue_out |
| 9 | 9 |  |  |  |  |  | 1girl, looking_at_viewer, playboy_bunny, solo, bare_shoulders, blush, detached_collar, strapless_leotard, black_leotard, cleavage, pantyhose, rabbit_ears, wrist_cuffs, bowtie, fake_animal_ears, simple_background, iron_cross, ribbon, smile, white_background, covered_navel, holding |
| 10 | 6 |  |  |  |  |  | 1girl, black_skirt, holding, looking_at_viewer, simple_background, solo, pleated_skirt, school_uniform, white_background, white_shirt, blush, full_body, alternate_costume, cross, earrings, long_sleeves, ribbon, school_bag, standing |
| 11 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, solo, white_apron, blush, enmaided, puffy_short_sleeves, cup, frilled_apron, maid_headdress, orange_eyes, waist_apron, animal_ears, black_dress, black_thighhighs, bow, brown_eyes, closed_mouth, food, holding_tray, maid_apron, open_mouth, parted_bangs, standing, wrist_cuffs |
| 12 | 7 |  |  |  |  |  | blush, 1girl, hetero, solo_focus, tongue_out, 1boy, open_mouth, facial, heart-shaped_pupils, yellow_eyes, cum_in_mouth, cum_on_breasts, cum_on_hair, erection, horns, licking_penis, uncensored |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | black_dress | looking_at_viewer | solo | black_gloves | blush | brown_eyes | hair_bow | simple_background | thighs | white_background | barefoot | sitting | cleavage | maid_headdress | official_alternate_costume | black_thighhighs | elbow_gloves | navel | smile | cross | red_eyes | thigh_strap | garter_straps | iron_cross | long_sleeves | sideboob | machinery | rigging | swept_bangs | finger_to_mouth | grey_hair | ass | orange_eyes | closed_mouth | kimono | wide_sleeves | black_panties | sakazuki | choker | holding | obi | open_mouth | side-tie_panties | alcohol | collarbone | nail_polish | pelvic_curtain | bridal_gauntlets | bridal_veil | see-through | wedding_dress | white_dress | white_gloves | flower | white_thighhighs | petals | cropped_jacket | cropped_shirt | open_jacket | race_queen | red_panties | yellow_eyes | half_gloves | thighhighs | two-tone_skirt | panty_straps | miniskirt | purple_jacket | underboob_cutout | white_belt | white_skirt | checkered_flag | earpiece | black_jacket | twintails | blue_sky | highleg | holding_flag | standing | stomach | thigh_boots | two-tone_hair | nurse_cap | short_sleeves | holding_syringe | white_headwear | hair_ribbon | black_bikini | black_ribbon | side-tie_bikini_bottom | black_choker | german_flag_bikini | day | outdoors | earrings | tongue_out | playboy_bunny | detached_collar | strapless_leotard | black_leotard | pantyhose | rabbit_ears | wrist_cuffs | bowtie | fake_animal_ears | ribbon | covered_navel | black_skirt | pleated_skirt | school_uniform | white_shirt | full_body | alternate_costume | school_bag | white_apron | enmaided | puffy_short_sleeves | cup | frilled_apron | waist_apron | animal_ears | bow | food | holding_tray | maid_apron | parted_bangs | hetero | solo_focus | 1boy | facial | heart-shaped_pupils | cum_in_mouth | cum_on_breasts | cum_on_hair | erection | horns | licking_penis | uncensored |
|----:|----------:|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | | X | X | X | X | X | | X | X | X | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 14 |  |  |  |  |  | X | | | X | X | X | | | | X | | X | | X | | | | X | | | X | | | | X | X | X | X | | X | | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | | | X | X | | X | | | X | X | X | | | | | | X | | | X | | | | | X | X | X | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | X | | X | X | | X | X | | | X | | | X | X | | X | | | | X | | | | X | | | | | | | X | X | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 10 |  |  |  |  |  | X | | | X | X | X | | | | X | | X | | X | | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | | X | X | | X | | | X | X | X | | | X | | | | | | X | X | | | | | | X | | | X | | | | X | | | | | | | | | X | | | | | | | | | | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 73 |  |  |  |  |  | X | X | | X | X | | X | X | | X | X | | | | X | | X | | | X | X | X | | X | | | | | | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 9 |  |  |  |  |  | X | X | | X | X | | X | | | X | | X | | | X | | | | | | X | | | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 6 |  |  |  |  |  | X | | | X | X | | X | | | X | | X | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 5 |  |  |  |  |  | X | | X | X | X | | X | X | | | | | | | | X | | X | | | | | | | | | | | | | | | | | X | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 12 | 7 |  |  |  |  |  | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
richardr1126/spider-natsql-context-validation | ---
language:
- en
license:
- cc-by-4.0
source_datasets:
- spider
tags:
- sql
- spider
- natsql
- text-to-sql
- sql finetune
dataset_info:
features:
- name: db_id
dtype: string
- name: prompt
dtype: string
- name: ground_truth
dtype: string
---
# Dataset Card for Spider NatSQL Context Validation
### Dataset Summary
[Spider](https://arxiv.org/abs/1809.08887) is a large-scale complex and cross-domain semantic parsing and text-to-SQL dataset annotated by 11 Yale students.
The goal of the Spider challenge is to develop natural language interfaces to cross-domain databases.
This dataset was created to validate LLMs on the Spider dev dataset with database context using NatSQL.
### NatSQL
[NatSQL](https://arxiv.org/abs/2109.05153) is an intermediate representation for SQL that simplifies the queries and reduces the mismatch between
natural language and SQL. NatSQL preserves the core functionalities of SQL, but removes some clauses and keywords
that are hard to infer from natural language descriptions. NatSQL also makes schema linking easier by reducing the
number of schema items to predict. NatSQL can be easily converted to executable SQL queries and can improve the
performance of text-to-SQL models.
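Based on the features declared above (`db_id`, `prompt`, `ground_truth`), a record can be sketched as follows; the database id, prompt text, and NatSQL string are invented for illustration and do not come from the dataset:

```python
# Hypothetical example record; the shape follows the dataset_info features,
# while the concrete values are made up for this sketch.
record = {
    "db_id": "concert_singer",
    "prompt": "-- schema and question context would appear here --\n"
              "Question: How many singers do we have?",
    "ground_truth": "select count ( singer.* ) from singer",
}
assert set(record) == {"db_id", "prompt", "ground_truth"}
```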
### Yale Lily Spider Leaderboards
The leaderboard can be seen at https://yale-lily.github.io/spider
### Languages
The text in the dataset is in English.
### Licensing Information
The spider dataset is licensed under
the [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/legalcode)
### Citation
```
@article{yu2018spider,
title={Spider: A large-scale human-labeled dataset for complex and cross-domain semantic parsing and text-to-sql task},
author={Yu, Tao and Zhang, Rui and Yang, Kai and Yasunaga, Michihiro and Wang, Dongxu and Li, Zifan and Ma, James and Li, Irene and Yao, Qingning and Roman, Shanelle and others},
journal={arXiv preprint arXiv:1809.08887},
year={2018}
}
```
```
@inproceedings{gan-etal-2021-natural-sql,
title = "Natural {SQL}: Making {SQL} Easier to Infer from Natural Language Specifications",
author = "Gan, Yujian and
Chen, Xinyun and
Xie, Jinxia and
Purver, Matthew and
Woodward, John R. and
Drake, John and
Zhang, Qiaofu",
booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2021",
month = nov,
year = "2021",
address = "Punta Cana, Dominican Republic",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.findings-emnlp.174",
doi = "10.18653/v1/2021.findings-emnlp.174",
pages = "2030--2042",
}
``` |
StankyDanko/testing-kp2 | ---
license: afl-3.0
---
|
BeIR/signal1m-generated-queries | ---
annotations_creators: []
language_creators: []
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
paperswithcode_id: beir
pretty_name: BEIR Benchmark
size_categories:
msmarco:
- 1M<n<10M
trec-covid:
- 100k<n<1M
nfcorpus:
- 1K<n<10K
nq:
- 1M<n<10M
hotpotqa:
- 1M<n<10M
fiqa:
- 10K<n<100K
arguana:
- 1K<n<10K
touche-2020:
- 100K<n<1M
cqadupstack:
- 100K<n<1M
quora:
- 100K<n<1M
dbpedia:
- 1M<n<10M
scidocs:
- 10K<n<100K
fever:
- 1M<n<10M
climate-fever:
- 1M<n<10M
scifact:
- 1K<n<10K
source_datasets: []
task_categories:
- text-retrieval
- zero-shot-retrieval
- information-retrieval
- zero-shot-information-retrieval
task_ids:
- passage-retrieval
- entity-linking-retrieval
- fact-checking-retrieval
- tweet-retrieval
- citation-prediction-retrieval
- duplication-question-retrieval
- argument-retrieval
- news-retrieval
- biomedical-information-retrieval
- question-answering-retrieval
---
# Dataset Card for BEIR Benchmark
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/UKPLab/beir
- **Repository:** https://github.com/UKPLab/beir
- **Paper:** https://openreview.net/forum?id=wCu6T5xFjeJ
- **Leaderboard:** https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns
- **Point of Contact:** nandan.thakur@uwaterloo.ca
### Dataset Summary
BEIR is a heterogeneous benchmark that has been built from 18 diverse datasets representing 9 information retrieval tasks:
- Fact-checking: [FEVER](http://fever.ai), [Climate-FEVER](http://climatefever.ai), [SciFact](https://github.com/allenai/scifact)
- Question-Answering: [NQ](https://ai.google.com/research/NaturalQuestions), [HotpotQA](https://hotpotqa.github.io), [FiQA-2018](https://sites.google.com/view/fiqa/)
- Bio-Medical IR: [TREC-COVID](https://ir.nist.gov/covidSubmit/index.html), [BioASQ](http://bioasq.org), [NFCorpus](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/)
- News Retrieval: [TREC-NEWS](https://trec.nist.gov/data/news2019.html), [Robust04](https://trec.nist.gov/data/robust/04.guidelines.html)
- Argument Retrieval: [Touche-2020](https://webis.de/events/touche-20/shared-task-1.html), [ArguAna](http://argumentation.bplaced.net/arguana/data)
- Duplicate Question Retrieval: [Quora](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs), [CqaDupstack](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/)
- Citation-Prediction: [SCIDOCS](https://allenai.org/data/scidocs)
- Tweet Retrieval: [Signal-1M](https://research.signal-ai.com/datasets/signal1m-tweetir.html)
- Entity Retrieval: [DBPedia](https://github.com/iai-group/DBpedia-Entity/)
All these datasets have been preprocessed and can be used for your experiments.
### Supported Tasks and Leaderboards
The dataset supports a leaderboard that evaluates models against task-specific retrieval metrics such as nDCG@10.
The current best performing models can be found on the [official leaderboard](https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns).
### Languages
All tasks are in English (`en`).
## Dataset Structure
All BEIR datasets must contain a corpus, queries and qrels (relevance judgments file). They must be in the following format:
- `corpus` file: a `.jsonl` (JSON Lines) file containing a list of dictionaries, each with three fields: `_id` (a unique document identifier), `title` (an optional document title) and `text` (a document paragraph or passage). For example: `{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}`
- `queries` file: a `.jsonl` (JSON Lines) file containing a list of dictionaries, each with two fields: `_id` (a unique query identifier) and `text` (the query text). For example: `{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}`
- `qrels` file: a `.tsv` (tab-separated) file with three columns, `query-id`, `corpus-id` and `score`, in this order. Keep the first row as a header. For example: `q1 doc1 1`
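As an illustration, the three files described above can be written and read back with nothing but the Python standard library (file names and contents here are made up for the example):

```python
import csv
import json
import tempfile
from pathlib import Path

# Create a tiny BEIR-style dataset in a temporary directory.
root = Path(tempfile.mkdtemp())

# corpus.jsonl: one JSON object per line with _id, title, text.
(root / "corpus.jsonl").write_text(
    json.dumps({"_id": "doc1", "title": "Albert Einstein",
                "text": "Albert Einstein was a German-born physicist."}) + "\n"
)

# queries.jsonl: one JSON object per line with _id and text.
(root / "queries.jsonl").write_text(
    json.dumps({"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}) + "\n"
)

# qrels file: header row, then query-id, corpus-id, score (tab-separated).
(root / "test.tsv").write_text("query-id\tcorpus-id\tscore\nq1\tdoc1\t1\n")

# Parse everything back into plain dictionaries.
corpus = {}
with open(root / "corpus.jsonl") as f:
    for line in f:
        doc = json.loads(line)
        corpus[doc["_id"]] = {"title": doc.get("title", ""), "text": doc["text"]}

queries = {}
with open(root / "queries.jsonl") as f:
    for line in f:
        q = json.loads(line)
        queries[q["_id"]] = q["text"]

qrels = {}
with open(root / "test.tsv") as f:
    for row in csv.DictReader(f, delimiter="\t"):
        qrels.setdefault(row["query-id"], {})[row["corpus-id"]] = int(row["score"])

print(qrels)  # {'q1': {'doc1': 1}}
```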
### Data Instances
A high-level example of a BEIR dataset:
```python
corpus = {
"doc1" : {
"title": "Albert Einstein",
"text": "Albert Einstein was a German-born theoretical physicist who developed the theory of relativity, \
one of the two pillars of modern physics (alongside quantum mechanics). His work is also known for \
its influence on the philosophy of science. He is best known to the general public for his mass–energy \
equivalence formula E = mc2, which has been dubbed 'the world's most famous equation'. He received the 1921 \
Nobel Prize in Physics 'for his services to theoretical physics, and especially for his discovery of the law \
of the photoelectric effect', a pivotal step in the development of quantum theory."
},
"doc2" : {
"title": "", # Keep title an empty string if not present
"text": "Wheat beer is a top-fermented beer which is brewed with a large proportion of wheat relative to the amount of \
malted barley. The two main varieties are German Weißbier and Belgian witbier; other types include Lambic (made\
with wild yeast), Berliner Weisse (a cloudy, sour beer), and Gose (a sour, salty beer)."
},
}
queries = {
"q1" : "Who developed the mass-energy equivalence formula?",
"q2" : "Which beer is brewed with a large proportion of wheat?"
}
qrels = {
"q1" : {"doc1": 1},
"q2" : {"doc2": 1},
}
```
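For intuition only, here is a toy retrieval loop over data shaped like the example above. This is purely an illustration of how `corpus`, `queries` and `qrels` fit together; real BEIR evaluation uses proper retrievers and `pytrec_eval`-based metrics, not word overlap:

```python
corpus = {
    "doc1": {"title": "Albert Einstein",
             "text": "Albert Einstein was a German-born theoretical physicist who developed "
                     "the theory of relativity and the mass-energy equivalence formula."},
    "doc2": {"title": "",
             "text": "Wheat beer is a top-fermented beer brewed with a large proportion of wheat."},
}
queries = {
    "q1": "Who developed the mass-energy equivalence formula?",
    "q2": "Which beer is brewed with a large proportion of wheat?",
}
qrels = {"q1": {"doc1": 1}, "q2": {"doc2": 1}}

def score(query: str, doc: dict) -> int:
    # Naive relevance score: number of shared lowercase words.
    q_words = set(query.lower().split())
    d_words = set((doc["title"] + " " + doc["text"]).lower().split())
    return len(q_words & d_words)

hits = 0
for qid, query in queries.items():
    best_doc = max(corpus, key=lambda did: score(query, corpus[did]))
    hits += int(best_doc in qrels[qid])

print(f"accuracy@1 = {hits / len(queries):.2f}")  # accuracy@1 = 1.00
```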
### Data Fields
Examples from all configurations have the following features:
### Corpus
- `corpus`: a `dict` feature representing the document title and passage text, made up of:
- `_id`: a `string` feature representing the unique document id
- `title`: a `string` feature, denoting the title of the document.
- `text`: a `string` feature, denoting the text of the document.
### Queries
- `queries`: a `dict` feature representing the query, made up of:
- `_id`: a `string` feature representing the unique query id
- `text`: a `string` feature, denoting the text of the query.
### Qrels
- `qrels`: a `dict` feature representing the query document relevance judgements, made up of:
- `query-id`: a `string` feature representing the unique query id
- `corpus-id`: a `string` feature, denoting the document id.
- `score`: a `int32` feature, denoting the relevance judgement between query and document.
### Data Splits
| Dataset | Website | BEIR-Name | Type | Queries | Corpus | Rel D/Q | Download | md5 |
| -------- | -----| ---------| --------- | ----------- | ---------| ---------| :----------: | :------:|
| MSMARCO | [Homepage](https://microsoft.github.io/msmarco/)| ``msmarco`` | ``train``<br>``dev``<br>``test``| 6,980 | 8.84M | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/msmarco.zip) | ``444067daf65d982533ea17ebd59501e4`` |
| TREC-COVID | [Homepage](https://ir.nist.gov/covidSubmit/index.html)| ``trec-covid``| ``test``| 50| 171K| 493.5 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/trec-covid.zip) | ``ce62140cb23feb9becf6270d0d1fe6d1`` |
| NFCorpus | [Homepage](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/) | ``nfcorpus`` | ``train``<br>``dev``<br>``test``| 323 | 3.6K | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nfcorpus.zip) | ``a89dba18a62ef92f7d323ec890a0d38d`` |
| BioASQ | [Homepage](http://bioasq.org) | ``bioasq``| ``train``<br>``test`` | 500 | 14.91M | 8.05 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#2-bioasq) |
| NQ | [Homepage](https://ai.google.com/research/NaturalQuestions) | ``nq``| ``train``<br>``test``| 3,452 | 2.68M | 1.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nq.zip) | ``d4d3d2e48787a744b6f6e691ff534307`` |
| HotpotQA | [Homepage](https://hotpotqa.github.io) | ``hotpotqa``| ``train``<br>``dev``<br>``test``| 7,405 | 5.23M | 2.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/hotpotqa.zip) | ``f412724f78b0d91183a0e86805e16114`` |
| FiQA-2018 | [Homepage](https://sites.google.com/view/fiqa/) | ``fiqa`` | ``train``<br>``dev``<br>``test``| 648 | 57K | 2.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fiqa.zip) | ``17918ed23cd04fb15047f73e6c3bd9d9`` |
| Signal-1M(RT) | [Homepage](https://research.signal-ai.com/datasets/signal1m-tweetir.html)| ``signal1m`` | ``test``| 97 | 2.86M | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#4-signal-1m) |
| TREC-NEWS | [Homepage](https://trec.nist.gov/data/news2019.html) | ``trec-news`` | ``test``| 57 | 595K | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#1-trec-news) |
| ArguAna | [Homepage](http://argumentation.bplaced.net/arguana/data) | ``arguana``| ``test`` | 1,406 | 8.67K | 1.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/arguana.zip) | ``8ad3e3c2a5867cdced806d6503f29b99`` |
| Touche-2020| [Homepage](https://webis.de/events/touche-20/shared-task-1.html) | ``webis-touche2020``| ``test``| 49 | 382K | 19.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/webis-touche2020.zip) | ``46f650ba5a527fc69e0a6521c5a23563`` |
| CQADupstack| [Homepage](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/) | ``cqadupstack``| ``test``| 13,145 | 457K | 1.4 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/cqadupstack.zip) | ``4e41456d7df8ee7760a7f866133bda78`` |
| Quora| [Homepage](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs) | ``quora``| ``dev``<br>``test``| 10,000 | 523K | 1.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/quora.zip) | ``18fb154900ba42a600f84b839c173167`` |
| DBPedia | [Homepage](https://github.com/iai-group/DBpedia-Entity/) | ``dbpedia-entity``| ``dev``<br>``test``| 400 | 4.63M | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/dbpedia-entity.zip) | ``c2a39eb420a3164af735795df012ac2c`` |
| SCIDOCS| [Homepage](https://allenai.org/data/scidocs) | ``scidocs``| ``test``| 1,000 | 25K | 4.9 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scidocs.zip) | ``38121350fc3a4d2f48850f6aff52e4a9`` |
| FEVER | [Homepage](http://fever.ai) | ``fever``| ``train``<br>``dev``<br>``test``| 6,666 | 5.42M | 1.2| [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fever.zip) | ``5a818580227bfb4b35bb6fa46d9b6c03`` |
| Climate-FEVER| [Homepage](http://climatefever.ai) | ``climate-fever``|``test``| 1,535 | 5.42M | 3.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/climate-fever.zip) | ``8b66f0a9126c521bae2bde127b4dc99d`` |
| SciFact| [Homepage](https://github.com/allenai/scifact) | ``scifact``| ``train``<br>``test``| 300 | 5K | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scifact.zip) | ``5f7d1de60b170fc8027bb7898e2efca1`` |
| Robust04 | [Homepage](https://trec.nist.gov/data/robust/04.guidelines.html) | ``robust04``| ``test``| 249 | 528K | 69.9 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#3-robust04) |
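BEIR's headline metric is nDCG@10. As a refresher, here is a minimal pure-Python nDCG computation for a single query (illustration only; official BEIR evaluation relies on `pytrec_eval`):

```python
import math

def dcg(relevances):
    # DCG = sum over positions i (starting at 1) of rel_i / log2(i + 1).
    return sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances))

def ndcg_at_k(ranked_doc_ids, qrels_for_query, k=10):
    # Gains of the returned ranking, cut off at k.
    gains = [qrels_for_query.get(doc_id, 0) for doc_id in ranked_doc_ids[:k]]
    # Ideal gains: all judged relevances sorted from best to worst.
    ideal = sorted(qrels_for_query.values(), reverse=True)[:k]
    ideal_dcg = dcg(ideal)
    return dcg(gains) / ideal_dcg if ideal_dcg > 0 else 0.0

# Toy ranking for one query: the single relevant document is returned second.
qrels_q = {"doc1": 1}
ranking = ["doc7", "doc1", "doc3"]
print(round(ndcg_at_k(ranking, qrels_q), 4))  # 0.6309
```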
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
Cite as:
```
@inproceedings{
thakur2021beir,
title={{BEIR}: A Heterogeneous Benchmark for Zero-shot Evaluation of Information Retrieval Models},
author={Nandan Thakur and Nils Reimers and Andreas R{\"u}ckl{\'e} and Abhishek Srivastava and Iryna Gurevych},
booktitle={Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 2)},
year={2021},
url={https://openreview.net/forum?id=wCu6T5xFjeJ}
}
```
### Contributions
Thanks to [@Nthakur20](https://github.com/Nthakur20) for adding this dataset. |
jpwahle/autoregressive-paraphrase-dataset | ---
annotations_creators:
- machine-generated
language:
- en
language_creators:
- machine-generated
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: Machine Paraphrase Dataset (T5, GPT-3)
size_categories:
- 100K<n<1M
source_datasets:
- original
tags:
- plagiarism
- paraphrase
- academic integrity
- arxiv
- wikipedia
- theses
- bert
- roberta
- t5
- gpt-3
task_categories:
- text-classification
- text-generation
task_ids: []
---
# Dataset Card for [Dataset Name]
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Size:** 163MB
- **Repository:** https://github.com/jpwahle/emnlp22-transforming
- **Paper:** https://arxiv.org/abs/2210.03568
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset. |
thanhduycao/soict_private_test | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: audio
sequence: float32
splits:
- name: train
num_bytes: 567721148
num_examples: 2139
download_size: 461186944
dataset_size: 567721148
---
# Dataset Card for "soict_private_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mutonix/RefGPT-Code-cr | ---
license: apache-2.0
dataset_info:
features:
- name: dialogue
dtype: string
- name: reference
dtype: string
- name: language
dtype: string
- name: type
dtype: string
splits:
- name: en
num_bytes: 165025559.5254741
num_examples: 14119
- name: zh
num_bytes: 157858797.9941188
num_examples: 15288
download_size: 136112295
dataset_size: 322884357.5195929
task_categories:
- conversational
language:
- zh
- en
arxiv: https://arxiv.org/abs/2305.14994
size_categories:
- 10K<n<100K
---
# Dataset Card for RefGPT-Code-cr
## Dataset Description
- **Homepage:**
- **Repository:** [https://github.com/ziliwangnlp/RefGPT](https://github.com/ziliwangnlp/RefGPT)
- **Paper:** [https://arxiv.org/abs/2305.14994](https://arxiv.org/abs/2305.14994)
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
<p align="center">
<a href="https://arxiv.org/abs/2305.14994"><b>[Paper] RefGPT</b></a> |
<a href="https://github.com/ziliwangnlp/RefGPT"><b>[Github] RefGPT</b></a>
</p>
RefGPT-Code is a dataset containing 76k multi-turn dialogues about programming (37k in English and 39k in Chinese), covering most code usage scenarios and multiple types of programming languages. Both the English and Chinese versions use the public GitHub dataset on Google BigQuery, with no overlap between the two languages. RefGPT-Code derives various ways of leveraging program code as the reference to enable different scenarios. We consider three perspectives in RefGPT-Code: code discussion, code creation and bug fixing.
**RefGPT-Code-cr** is the "code creation" subset.
### Supported Tasks and Leaderboards
Chatbot instruction finetuning
### Languages
Chinese, English
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
Please note that the RefGPT datasets, including RefGPT-Fact and RefGPT-Code, have not undergone manual verification; as such, their safety cannot be strictly guaranteed. Users should be aware that they are responsible for the results generated using this data.
### Discussion of Biases
As the RefGPT-Fact and RefGPT-Code datasets are collected using references such as Wikipedia and GitHub repositories, it cannot be ruled out that a reference itself contains factual errors, typos, or, in the case of GitHub repositories, bugs and malicious code. The datasets may also reflect the biases of the selected references and of the GPT-3.5/GPT-4 models.
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```bibtex
@misc{yang2023refgpt,
title={RefGPT: Reference -> Truthful & Customized Dialogues Generation by GPTs and for GPTs},
author={Dongjie Yang and Ruifeng Yuan and YuanTao Fan and YiFei Yang and Zili Wang and Shusen Wang and Hai Zhao},
year={2023},
eprint={2305.14994},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Contributions
[More Information Needed] |
thobauma/harmless-poisoned-0.04-SUDO-murder | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 58402939.44335993
num_examples: 42537
download_size: 31364075
dataset_size: 58402939.44335993
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Nexdata/Mandarin_Spontaneous_Speech_Data_by_Mobile_Phone | ---
task_categories:
- automatic-speech-recognition
language:
- zh
---
# Dataset Card for Nexdata/Mandarin_Spontaneous_Speech_Data_by_Mobile_Phone
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/77?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
The data were recorded by 700 Mandarin speakers, 65% of whom were women. There is no pre-made script: speakers make phone calls in a natural way while the contents of the calls are recorded. The annotations mainly cover the near-end speech, and the speech content is naturally colloquial.
For more details, please refer to the link: https://www.nexdata.ai/datasets/77?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train a model for Automatic Speech Recognition (ASR).
### Languages
Mandarin
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License
### Citation Information
[More Information Needed]
### Contributions |
haosulab/ManiSkill | ---
license: apache-2.0
language:
- en
tags:
- robotics
- reinforcement learning
- embodied ai
- computer vision
- simulation
- Embodied AI
size_categories:
- 1M<n<10M
task_categories:
- reinforcement-learning
- robotics
viewer: false
---
# ManiSkill Data

[](https://badge.fury.io/py/mani-skill2) [](https://colab.research.google.com/github/haosulab/ManiSkill2/blob/main/examples/tutorials/1_quickstart.ipynb)
[](https://haosulab.github.io/ManiSkill2)
[](https://discord.gg/x8yUZe5AdN)
<!-- [](https://haosulab.github.io/ManiSkill2) -->
ManiSkill is a unified benchmark for learning generalizable robotic manipulation skills, powered by [SAPIEN](https://sapien.ucsd.edu/). **It features 20 out-of-the-box task families with 2000+ diverse object models and 4M+ demonstration frames**. Moreover, it supports fast visual-input learning algorithms, so that **a CNN-based policy can collect samples at about 2000 FPS with 1 GPU and 16 processes on a workstation**. The benchmark can be used to study a wide range of algorithms: 2D & 3D vision-based reinforcement learning, imitation learning, sense-plan-act, etc.
This is the Hugging Face datasets page for all data related to [ManiSkill2](https://github.com/haosulab/ManiSkill2),
including **assets, robot demonstrations, and pretrained models.** Note: there were previously two iterations, ManiSkill and ManiSkill2; we are rebranding everything to simply ManiSkill, and the Python package version tells you which iteration you are using.
For detailed information about ManiSkill, head over to our [GitHub repository](https://github.com/haosulab/ManiSkill2), [website](https://maniskill2.github.io/), [ICLR 2023 paper](https://arxiv.org/abs/2302.04659), or [documentation](https://maniskill.readthedocs.io/en/dev/).
**Note that to download the data you must use the mani_skill package to do so as shown below, currently loading through HuggingFace datasets does not work as intended just yet**
## Assets
Some environments require you to download additional assets, which are stored here.
You can download task-specific assets by running
```
python -m mani_skill.utils.download_asset ${ENV_ID}
```
## Demonstration Data
We provide a command line tool (mani_skill.utils.download_demo) to download demonstrations from here.
```
# Download the demonstration dataset for a specific task
python -m mani_skill.utils.download_demo ${ENV_ID}
# Download the demonstration datasets for all rigid-body tasks to "./demos"
python -m mani_skill.utils.download_demo rigid_body -o ./demos
```
To learn how to use the demonstrations and what environments are available, go to the demonstrations documentation page: https://maniskill.readthedocs.io/en/dev/user_guide/datasets/datasets.html
## License
All rigid body environments in ManiSkill are licensed under fully permissive licenses (e.g., Apache-2.0).
The assets are licensed under [CC BY-NC 4.0](https://creativecommons.org/licenses/by-nc/4.0/legalcode).
## Citation
If you use ManiSkill or its assets, models, and demonstrations, please cite using the following BibTeX entry for now:
```
@inproceedings{gu2023maniskill2,
title={ManiSkill2: A Unified Benchmark for Generalizable Manipulation Skills},
author={Gu, Jiayuan and Xiang, Fanbo and Li, Xuanlin and Ling, Zhan and Liu, Xiqiang and Mu, Tongzhou and Tang, Yihe and Tao, Stone and Wei, Xinyue and Yao, Yunchao and Yuan, Xiaodi and Xie, Pengwei and Huang, Zhiao and Chen, Rui and Su, Hao},
booktitle={International Conference on Learning Representations},
year={2023}
}
```
A ManiSkill3 BibTeX entry will be added later. |
ashhadulislam/arabic_medical_test | ---
license: mit
---
|
lucadiliello/wikiqa_grouped | ---
task_categories:
- text-classification
language:
- en
pretty_name: WikiQA
size_categories:
- 1K<n<10K
---
WikiQA dataset with answers grouped together for each question. |
EinsZwo/nlid_supertag_train_10k | ---
dataset_info:
features:
- name: lang
dtype: string
- name: doc
dtype: string
- name: supertags
dtype: string
splits:
- name: train
num_bytes: 138637740
num_examples: 119654
download_size: 52997334
dataset_size: 138637740
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
anan-2024/twitter_dataset_1713029677 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 26556
num_examples: 59
download_size: 14095
dataset_size: 26556
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
anubrag/Python-CodeExercises-Evol | ---
license: mit
---
|
juancopi81/jcpvincentcat | ---
license: openrail
---
|
nlpso/m2m3_qualitative_analysis_ref_ptrn_cmbert_iob2 | ---
language:
- fr
multilinguality:
- monolingual
task_categories:
- token-classification
---
# m2m3_qualitative_analysis_ref_ptrn_cmbert_iob2
## Introduction
This dataset was used to perform **qualitative analysis** of [HueyNemud/das22-10-camembert_pretrained](https://huggingface.co/HueyNemud/das22-10-camembert_pretrained) on a **nested NER task** using the joint labelling [M2] and hierarchical [M3] approaches.
It contains Paris trade directory entries from the 19th century.
## Dataset parameters
* Approaches : M2 and M3
* Dataset type : ground-truth
* Tokenizer : [HueyNemud/das22-10-camembert_pretrained](https://huggingface.co/HueyNemud/das22-10-camembert_pretrained)
* Tagging format : IOB2
* Counts :
* Train : 6084
* Dev : 676
* Test : 1685
* Associated fine-tuned models :
* M2 : [nlpso/m2_joint_label_ref_ptrn_cmbert_iob2](https://huggingface.co/nlpso/m2_joint_label_ref_ptrn_cmbert_iob2)
* M3 : [nlpso/m3_hierarchical_ner_ref_ptrn_cmbert_iob2](https://huggingface.co/nlpso/m3_hierarchical_ner_ref_ptrn_cmbert_iob2)
## Entity types
Abbreviation|Entity group (level)|Description
-|-|-
O |1 & 2|Outside of a named entity
PER |1|Person or company name
ACT |1 & 2|Person or company professional activity
TITREH |2|Military or civil distinction
DESC |1|Entry full description
TITREP |2|Professionnal reward
SPAT |1|Address
LOC |2|Street name
CARDINAL |2|Street number
FT |2|Geographical feature
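To make the IOB2 tagging format concrete, here is a small, generic decoder that turns token-level IOB2 tags into entity spans (this is a standard routine, not code from the associated repository, and the directory entry below is made up):

```python
def iob2_to_spans(tokens, tags):
    """Convert parallel token/IOB2-tag lists into (entity_type, text) spans."""
    spans, current_type, current_tokens = [], None, []
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            # A B- tag always starts a new entity, closing any open one.
            if current_type is not None:
                spans.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = tag[2:], [token]
        elif tag.startswith("I-") and current_type == tag[2:]:
            # An I- tag of the same type continues the current entity.
            current_tokens.append(token)
        else:
            # "O" (or an inconsistent I- tag) closes the current entity.
            if current_type is not None:
                spans.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = None, []
    if current_type is not None:
        spans.append((current_type, " ".join(current_tokens)))
    return spans

# A made-up directory entry: "Dupont, tailleur, rue de Rivoli 12"
tokens = ["Dupont", ",", "tailleur", ",", "rue", "de", "Rivoli", "12"]
tags = ["B-PER", "O", "B-ACT", "O", "B-SPAT", "I-SPAT", "I-SPAT", "I-SPAT"]
print(iob2_to_spans(tokens, tags))
# [('PER', 'Dupont'), ('ACT', 'tailleur'), ('SPAT', 'rue de Rivoli 12')]
```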
## How to use this dataset
```python
from datasets import load_dataset
train_dev_test = load_dataset("nlpso/m2m3_qualitative_analysis_ref_ptrn_cmbert_iob2")
```
|
yn01/test_20240125_01 | ---
dataset_info:
features:
- name: text
struct:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 104125
num_examples: 614
download_size: 21717
dataset_size: 104125
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
andersonbcdefg/beir_generated_queries_pairs | ---
dataset_info:
features:
- name: query
dtype: string
- name: pos
dtype: string
splits:
- name: train
num_bytes: 6760460815
num_examples: 9394910
download_size: 4045155599
dataset_size: 6760460815
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
FINNUMBER/FINCH_TRAIN_SA_FPB_400 | ---
dataset_info:
features:
- name: task
dtype: string
- name: context
dtype: string
- name: question
dtype: 'null'
- name: answer
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 529689
num_examples: 400
download_size: 224561
dataset_size: 529689
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
oviniciusc/blaze | ---
license: lgpl-3.0
---
|
open-llm-leaderboard/details_NeuralNovel__Tiger-7b-v0.1 | ---
pretty_name: Evaluation run of NeuralNovel/Tiger-7b-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NeuralNovel/Tiger-7b-v0.1](https://huggingface.co/NeuralNovel/Tiger-7b-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NeuralNovel__Tiger-7b-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-19T00:30:17.528076](https://huggingface.co/datasets/open-llm-leaderboard/details_NeuralNovel__Tiger-7b-v0.1/blob/main/results_2024-01-19T00-30-17.528076.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6140276670516231,\n\
\ \"acc_stderr\": 0.033170625938141664,\n \"acc_norm\": 0.6176750246888549,\n\
\ \"acc_norm_stderr\": 0.03384254890386933,\n \"mc1\": 0.44920440636474906,\n\
\ \"mc1_stderr\": 0.017412941986115305,\n \"mc2\": 0.6103468565333238,\n\
\ \"mc2_stderr\": 0.015326695061753768\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5588737201365188,\n \"acc_stderr\": 0.014509747749064663,\n\
\ \"acc_norm\": 0.5998293515358362,\n \"acc_norm_stderr\": 0.014317197787809169\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6415056761601274,\n\
\ \"acc_stderr\": 0.004785781979354866,\n \"acc_norm\": 0.832105158334993,\n\
\ \"acc_norm_stderr\": 0.0037300899105375805\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.0373362665538351,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.0373362665538351\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.0250437573185202,\n \"acc_norm\"\
: 0.3835978835978836,\n \"acc_norm_stderr\": 0.0250437573185202\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6580645161290323,\n\
\ \"acc_stderr\": 0.026985289576552732,\n \"acc_norm\": 0.6580645161290323,\n\
\ \"acc_norm_stderr\": 0.026985289576552732\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5871794871794872,\n \"acc_stderr\": 0.024962683564331796,\n\
\ \"acc_norm\": 0.5871794871794872,\n \"acc_norm_stderr\": 0.024962683564331796\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.02911661760608301,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02911661760608301\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \
\ \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8146788990825689,\n \"acc_stderr\": 0.01665927970029582,\n \"\
acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.01665927970029582\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967408,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967408\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159263,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159263\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.03219079200419996,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.03219079200419996\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.041184385658062976,\n\
\ \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.041184385658062976\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097654,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097654\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.02220930907316561,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.02220930907316561\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7994891443167306,\n\
\ \"acc_stderr\": 0.014317653708594202,\n \"acc_norm\": 0.7994891443167306,\n\
\ \"acc_norm_stderr\": 0.014317653708594202\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6791907514450867,\n \"acc_stderr\": 0.025131000233647886,\n\
\ \"acc_norm\": 0.6791907514450867,\n \"acc_norm_stderr\": 0.025131000233647886\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.441340782122905,\n\
\ \"acc_stderr\": 0.016607021781050876,\n \"acc_norm\": 0.441340782122905,\n\
\ \"acc_norm_stderr\": 0.016607021781050876\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.02623696588115327,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.02623696588115327\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409828,\n\
\ \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409828\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291463,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291463\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4276401564537158,\n\
\ \"acc_stderr\": 0.012635799922765846,\n \"acc_norm\": 0.4276401564537158,\n\
\ \"acc_norm_stderr\": 0.012635799922765846\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.029349803139765873,\n\
\ \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.029349803139765873\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6356209150326797,\n \"acc_stderr\": 0.019469518221573695,\n \
\ \"acc_norm\": 0.6356209150326797,\n \"acc_norm_stderr\": 0.019469518221573695\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7661691542288557,\n\
\ \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.7661691542288557,\n\
\ \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333047,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333047\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44920440636474906,\n\
\ \"mc1_stderr\": 0.017412941986115305,\n \"mc2\": 0.6103468565333238,\n\
\ \"mc2_stderr\": 0.015326695061753768\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.77663772691397,\n \"acc_stderr\": 0.01170569756520521\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.467778620166793,\n \
\ \"acc_stderr\": 0.013743857303073802\n }\n}\n```"
repo_url: https://huggingface.co/NeuralNovel/Tiger-7b-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|arc:challenge|25_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|gsm8k|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hellaswag|10_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T00-30-17.528076.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-19T00-30-17.528076.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- '**/details_harness|winogrande|5_2024-01-19T00-30-17.528076.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-19T00-30-17.528076.parquet'
- config_name: results
data_files:
- split: 2024_01_19T00_30_17.528076
path:
- results_2024-01-19T00-30-17.528076.parquet
- split: latest
path:
- results_2024-01-19T00-30-17.528076.parquet
---
# Dataset Card for Evaluation run of NeuralNovel/Tiger-7b-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NeuralNovel/Tiger-7b-v0.1](https://huggingface.co/NeuralNovel/Tiger-7b-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NeuralNovel__Tiger-7b-v0.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-19T00:30:17.528076](https://huggingface.co/datasets/open-llm-leaderboard/details_NeuralNovel__Tiger-7b-v0.1/blob/main/results_2024-01-19T00-30-17.528076.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6140276670516231,
"acc_stderr": 0.033170625938141664,
"acc_norm": 0.6176750246888549,
"acc_norm_stderr": 0.03384254890386933,
"mc1": 0.44920440636474906,
"mc1_stderr": 0.017412941986115305,
"mc2": 0.6103468565333238,
"mc2_stderr": 0.015326695061753768
},
"harness|arc:challenge|25": {
"acc": 0.5588737201365188,
"acc_stderr": 0.014509747749064663,
"acc_norm": 0.5998293515358362,
"acc_norm_stderr": 0.014317197787809169
},
"harness|hellaswag|10": {
"acc": 0.6415056761601274,
"acc_stderr": 0.004785781979354866,
"acc_norm": 0.832105158334993,
"acc_norm_stderr": 0.0037300899105375805
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.0373362665538351,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.0373362665538351
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.0250437573185202,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.0250437573185202
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6580645161290323,
"acc_stderr": 0.026985289576552732,
"acc_norm": 0.6580645161290323,
"acc_norm_stderr": 0.026985289576552732
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015178,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5871794871794872,
"acc_stderr": 0.024962683564331796,
"acc_norm": 0.5871794871794872,
"acc_norm_stderr": 0.024962683564331796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.02911661760608301,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.02911661760608301
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8146788990825689,
"acc_stderr": 0.01665927970029582,
"acc_norm": 0.8146788990825689,
"acc_norm_stderr": 0.01665927970029582
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967408,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159263,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159263
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.03219079200419996,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.03219079200419996
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.041184385658062976,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.041184385658062976
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097654,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097654
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.02220930907316561,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.02220930907316561
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7994891443167306,
"acc_stderr": 0.014317653708594202,
"acc_norm": 0.7994891443167306,
"acc_norm_stderr": 0.014317653708594202
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6791907514450867,
"acc_stderr": 0.025131000233647886,
"acc_norm": 0.6791907514450867,
"acc_norm_stderr": 0.025131000233647886
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.441340782122905,
"acc_stderr": 0.016607021781050876,
"acc_norm": 0.441340782122905,
"acc_norm_stderr": 0.016607021781050876
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.02623696588115327,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.02623696588115327
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409828,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409828
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291463,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291463
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4276401564537158,
"acc_stderr": 0.012635799922765846,
"acc_norm": 0.4276401564537158,
"acc_norm_stderr": 0.012635799922765846
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.029349803139765873,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.029349803139765873
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6356209150326797,
"acc_stderr": 0.019469518221573695,
"acc_norm": 0.6356209150326797,
"acc_norm_stderr": 0.019469518221573695
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7661691542288557,
"acc_stderr": 0.029929415408348384,
"acc_norm": 0.7661691542288557,
"acc_norm_stderr": 0.029929415408348384
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333047,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333047
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.44920440636474906,
"mc1_stderr": 0.017412941986115305,
"mc2": 0.6103468565333238,
"mc2_stderr": 0.015326695061753768
},
"harness|winogrande|5": {
"acc": 0.77663772691397,
"acc_stderr": 0.01170569756520521
},
"harness|gsm8k|5": {
"acc": 0.467778620166793,
"acc_stderr": 0.013743857303073802
}
}
```
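As an illustrative aside (this snippet is not part of the official evaluation pipeline), a results payload like the one above is plain JSON, so per-task metrics can be sliced with ordinary dict operations. The inline sample below is a small excerpt of the figures shown above:

```python
import json

# Small excerpt of the results JSON above, inlined for illustration.
results_json = """
{
  "all": {"acc": 0.6140276670516231, "acc_stderr": 0.033170625938141664},
  "harness|winogrande|5": {"acc": 0.77663772691397, "acc_stderr": 0.01170569756520521},
  "harness|gsm8k|5": {"acc": 0.467778620166793, "acc_stderr": 0.013743857303073802}
}
"""

results = json.loads(results_json)

# Drop the aggregate "all" entry and keep the per-task accuracies.
task_acc = {task: metrics["acc"] for task, metrics in results.items() if task != "all"}

# Find the task with the highest accuracy in this excerpt.
best_task = max(task_acc, key=task_acc.get)
print(best_task, round(task_acc[best_task], 4))
```

The same pattern applies to the full results file linked above once it has been downloaded.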
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
llm-lens/descriptors-text-davinci-003 | ---
dataset_info:
features:
- name: vocab
dtype: string
- name: descriptions
sequence: string
- name: prompt_descriptions
sequence: string
splits:
- name: birdsnap
num_bytes: 322488
num_examples: 500
- name: caltech101
num_bytes: 56880
num_examples: 102
- name: cifar100
num_bytes: 54081
num_examples: 100
- name: coco
num_bytes: 45186
num_examples: 80
- name: dtd
num_bytes: 25204
num_examples: 47
- name: fgvc
num_bytes: 74126
num_examples: 100
- name: food101
num_bytes: 58525
num_examples: 101
- name: full
num_bytes: 3000578
num_examples: 4951
- name: imagenet21k
num_bytes: 604897
num_examples: 998
- name: lvis
num_bytes: 679195
num_examples: 1198
- name: oxfordflowers
num_bytes: 58560
num_examples: 102
- name: oxfordpets
num_bytes: 22322
num_examples: 37
- name: stanfordcars
num_bytes: 157786
num_examples: 196
- name: sun397
num_bytes: 243017
num_examples: 362
- name: visualgenome
num_bytes: 1092697
num_examples: 1913
download_size: 3257945
dataset_size: 6495542
---
# Dataset Card for "descriptors-text-davinci-003"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DiegoRoberto10/diegorobert | ---
license: openrail
---
|
Enagamirzayev/llm-lingo_test | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
- name: start_time
dtype: string
- name: end_time
dtype: string
splits:
- name: train
num_bytes: 1273208.0
num_examples: 6
- name: validation
num_bytes: 1229380.0
num_examples: 5
download_size: 2508853
dataset_size: 2502588.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
Nexdata/1000_People_Italian_Handwriting_OCR_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
1,000 People - Italian Handwriting OCR Data. The writers are Europeans who often write Italian. The device is a scanner, and the collection angle is eye-level. The dataset content includes addresses, company names, and personal names. The dataset can be used for tasks such as Italian handwriting OCR.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1406?source=Huggingface
## Data size
1,000 people; each subject contributed 14 images
## Population distribution
Gender distribution: 548 males, 452 females. Age distribution: 6 people under 18 years old, 982 people from 18 to 45 years old, 8 people from 46 to 60 years old, 4 people over 60 years old.
## Writer
Europeans who often write Italian
## Collecting environment
pure color background
## Device
scanner
## Photographic angle
eye-level angle
## Data format
the image data format is .png
## Data content
including address, company name and personal name, each image has 20 writing boxes
## Accuracy rate
the collection content accuracy is not less than 97%
# Licensing Information
Commercial License
|
open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_CyberTron_Coder | ---
pretty_name: Evaluation run of LeroyDyer/Mixtral_AI_CyberTron_Coder
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [LeroyDyer/Mixtral_AI_CyberTron_Coder](https://huggingface.co/LeroyDyer/Mixtral_AI_CyberTron_Coder)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_CyberTron_Coder\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T20:56:47.877171](https://huggingface.co/datasets/open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_CyberTron_Coder/blob/main/results_2024-04-15T20-56-47.877171.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6281091573825272,\n\
\ \"acc_stderr\": 0.03253581913198614,\n \"acc_norm\": 0.6315388500683832,\n\
\ \"acc_norm_stderr\": 0.03318496235217278,\n \"mc1\": 0.41370869033047736,\n\
\ \"mc1_stderr\": 0.0172408618120998,\n \"mc2\": 0.5970736025002641,\n\
\ \"mc2_stderr\": 0.015056441504665573\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5930034129692833,\n \"acc_stderr\": 0.014356399418009124,\n\
\ \"acc_norm\": 0.6424914675767918,\n \"acc_norm_stderr\": 0.014005494275916573\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6496713802031467,\n\
\ \"acc_stderr\": 0.004760978203023326,\n \"acc_norm\": 0.8426608245369448,\n\
\ \"acc_norm_stderr\": 0.0036337609492583505\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.03761070869867479,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.03761070869867479\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932264,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932264\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424649,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424649\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n\
\ \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n\
\ \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511657,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511657\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397453,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397453\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.024321738484602354,\n\
\ \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.024321738484602354\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977924,\n\
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977924\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8293577981651377,\n \"acc_stderr\": 0.016129271025099843,\n \"\
acc_norm\": 0.8293577981651377,\n \"acc_norm_stderr\": 0.016129271025099843\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n \"\
acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912046,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912046\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.02363687331748928,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.02363687331748928\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8007662835249042,\n\
\ \"acc_stderr\": 0.014283378044296418,\n \"acc_norm\": 0.8007662835249042,\n\
\ \"acc_norm_stderr\": 0.014283378044296418\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.02418242749657762,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.02418242749657762\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4022346368715084,\n\
\ \"acc_stderr\": 0.016399716732847142,\n \"acc_norm\": 0.4022346368715084,\n\
\ \"acc_norm_stderr\": 0.016399716732847142\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n\
\ \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.025407197798890155,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890155\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.02971928127223684,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.02971928127223684\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44132985658409385,\n\
\ \"acc_stderr\": 0.012682016335646671,\n \"acc_norm\": 0.44132985658409385,\n\
\ \"acc_norm_stderr\": 0.012682016335646671\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6544117647058824,\n \"acc_stderr\": 0.028888193103988637,\n\
\ \"acc_norm\": 0.6544117647058824,\n \"acc_norm_stderr\": 0.028888193103988637\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6258169934640523,\n \"acc_stderr\": 0.01957695312208883,\n \
\ \"acc_norm\": 0.6258169934640523,\n \"acc_norm_stderr\": 0.01957695312208883\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n\
\ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n\
\ \"acc_stderr\": 0.027962677604768914,\n \"acc_norm\": 0.8059701492537313,\n\
\ \"acc_norm_stderr\": 0.027962677604768914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774711,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774711\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41370869033047736,\n\
\ \"mc1_stderr\": 0.0172408618120998,\n \"mc2\": 0.5970736025002641,\n\
\ \"mc2_stderr\": 0.015056441504665573\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7861089187056038,\n \"acc_stderr\": 0.011524466954090259\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.510235026535254,\n \
\ \"acc_stderr\": 0.013769598923012397\n }\n}\n```"
repo_url: https://huggingface.co/LeroyDyer/Mixtral_AI_CyberTron_Coder
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|arc:challenge|25_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|gsm8k|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hellaswag|10_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T20-56-47.877171.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T20-56-47.877171.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- '**/details_harness|winogrande|5_2024-04-15T20-56-47.877171.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T20-56-47.877171.parquet'
- config_name: results
data_files:
- split: 2024_04_15T20_56_47.877171
path:
- results_2024-04-15T20-56-47.877171.parquet
- split: latest
path:
- results_2024-04-15T20-56-47.877171.parquet
---
# Dataset Card for Evaluation run of LeroyDyer/Mixtral_AI_CyberTron_Coder
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [LeroyDyer/Mixtral_AI_CyberTron_Coder](https://huggingface.co/LeroyDyer/Mixtral_AI_CyberTron_Coder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_CyberTron_Coder",
"harness_winogrande_5",
         split="latest")
```
## Latest results
These are the [latest results from run 2024-04-15T20:56:47.877171](https://huggingface.co/datasets/open-llm-leaderboard/details_LeroyDyer__Mixtral_AI_CyberTron_Coder/blob/main/results_2024-04-15T20-56-47.877171.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6281091573825272,
"acc_stderr": 0.03253581913198614,
"acc_norm": 0.6315388500683832,
"acc_norm_stderr": 0.03318496235217278,
"mc1": 0.41370869033047736,
"mc1_stderr": 0.0172408618120998,
"mc2": 0.5970736025002641,
"mc2_stderr": 0.015056441504665573
},
"harness|arc:challenge|25": {
"acc": 0.5930034129692833,
"acc_stderr": 0.014356399418009124,
"acc_norm": 0.6424914675767918,
"acc_norm_stderr": 0.014005494275916573
},
"harness|hellaswag|10": {
"acc": 0.6496713802031467,
"acc_stderr": 0.004760978203023326,
"acc_norm": 0.8426608245369448,
"acc_norm_stderr": 0.0036337609492583505
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.03761070869867479,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.03761070869867479
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932264,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932264
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.02519710107424649,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.02519710107424649
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511657,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511657
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.026499057701397453,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.026499057701397453
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.024321738484602354,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.024321738484602354
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977924,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977924
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.016129271025099843,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.016129271025099843
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.033723432716530624,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.033723432716530624
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912046,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912046
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.02363687331748928,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.02363687331748928
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8007662835249042,
"acc_stderr": 0.014283378044296418,
"acc_norm": 0.8007662835249042,
"acc_norm_stderr": 0.014283378044296418
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.02418242749657762,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.02418242749657762
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4022346368715084,
"acc_stderr": 0.016399716732847142,
"acc_norm": 0.4022346368715084,
"acc_norm_stderr": 0.016399716732847142
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.025407197798890155,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.025407197798890155
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.02971928127223684,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.02971928127223684
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44132985658409385,
"acc_stderr": 0.012682016335646671,
"acc_norm": 0.44132985658409385,
"acc_norm_stderr": 0.012682016335646671
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6544117647058824,
"acc_stderr": 0.028888193103988637,
"acc_norm": 0.6544117647058824,
"acc_norm_stderr": 0.028888193103988637
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6258169934640523,
"acc_stderr": 0.01957695312208883,
"acc_norm": 0.6258169934640523,
"acc_norm_stderr": 0.01957695312208883
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.0293936093198798,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.0293936093198798
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768914,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774711,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774711
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41370869033047736,
"mc1_stderr": 0.0172408618120998,
"mc2": 0.5970736025002641,
"mc2_stderr": 0.015056441504665573
},
"harness|winogrande|5": {
"acc": 0.7861089187056038,
"acc_stderr": 0.011524466954090259
},
"harness|gsm8k|5": {
"acc": 0.510235026535254,
"acc_stderr": 0.013769598923012397
}
}
```
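The JSON above can be post-processed directly; for example, averaging `acc_norm` across tasks, as the aggregated `"all"` block does. This is a minimal sketch using two entries copied from the results above — the real file contains one entry per evaluated task:

```python
# Sketch: average acc_norm over per-task entries. The two values are
# copied from the results JSON above; a real results file has one
# entry per evaluated task.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.6424914675767918},
    "harness|hellaswag|10": {"acc_norm": 0.8426608245369448},
}
avg_acc_norm = sum(v["acc_norm"] for v in results.values()) / len(results)
print(round(avg_acc_norm, 4))
```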
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Oumar199/Nalohou_climatic_time_series | ---
task_categories:
- time-series-forecasting
language:
- en
pretty_name: Sub-Saharan-Time-Series-Forecasting
--- |
ShareGPTVideo/train_video_and_instruction | ---
license: apache-2.0
task_categories:
- question-answering
- other
language:
- en
tags:
- GPT-4V
- video
size_categories:
- n < 1M
---
# ShareGPTVideo Training Data
All datasets and models can be found at [ShareGPTVideo](https://huggingface.co/ShareGPTVideo).
# Contents:
- [Train 300k video frames](https://huggingface.co/datasets/ShareGPTVideo/train_video_and_instruction/tree/main/train_300k): contains the video frames used for the SFT and DPO models, a subset of the total 900k: ActivityNet 50k + VIDAL 150k + WebVid 100k.
- [Train 600k video frames](https://huggingface.co/datasets/ShareGPTVideo/train_video_and_instruction/tree/main/train_600k): contains the remaining 600k frames; the full 900k set (400k WebVid + 450k VIDAL + 50k ActivityNet) is used for the pre-training stage. If you only fine-tune with our video QA data, downloading the 300k subset above is sufficient.
- [Instruction for DPO](https://huggingface.co/datasets/ShareGPTVideo/train_video_and_instruction/tree/main/video_instruction/train/dpo): 17k video preference data.
**Training pipeline**: refer to [LLaVA-Hound-DPO training](https://github.com/RifleZhang/LLaVA-Hound-DPO/blob/main/llava_hound_dpo/dpo_scripts/README.md)
- [900k Detailed Video Caption](https://huggingface.co/datasets/ShareGPTVideo/train_video_and_instruction/blob/main/video_instruction/train/pretrain/video_caption_pretrain.jsonl): 900k video captions used for pre-training.
- [900k Video QA](https://huggingface.co/datasets/ShareGPTVideo/train_video_and_instruction/blob/main/video_instruction/train/qa/chatgpt_qa_900k.jsonl): for each of the 300k videos above we generate 3 QA pairs, 900k in total; only a 240k subset was used for SFT.
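Each line of the DPO instruction file is a standalone JSON record. A minimal reading sketch follows — the field names are taken from the DPO record in the "Preview examples" section below, while the inline sample values here are made up for illustration:

```python
import json

# Hypothetical inline sample following the DPO record schema shown in
# the "Preview examples" section; a real .jsonl file is read line by line,
# with json.loads applied to each line.
line = json.dumps({
    "id": "demo_0",
    "prompt": "What is shown in the video?",
    "chosen": "A detailed, grounded answer.",
    "rejected": "A vague answer.",
    "chosen_score": 5.0,
    "rejected_score": 3.0,
})
record = json.loads(line)
# A (chosen, rejected) pair is the unit consumed by DPO-style training.
preference_pair = (record["chosen"], record["rejected"])
```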
# Set up:
```bash
git clone git@github.com:RifleZhang/LLaVA-Hound-DPO.git
# fill in requirement path and token at: https://github.com/RifleZhang/LLaVA-Hound-DPO/blob/main/setup/set_path.sh
source setup/setup_env.sh
source setup/setup_train_data.sh
```
**Video Frames**:
```bash
video_data
└── train
...
keep updating
```
**Train instruction**:
```bash
video_instruction
├── train
│ ├── dpo
...
keep updating
```
# Preview examples
<details><summary>DPO Data</summary>
```bash
{'id': '--HUaAQKqgA_2',
'prompt': 'How does the background fabric enhance the decorative appeal of the mirror cover?',
'answer': 'The background fabric on which the mirror cover is displayed is golden with a shiny, patterned texture, which enhances the overall decorative appeal of the cover.',
'video': '--HUaAQKqgA',
'chosen': 'The background fabric has a gold hue, which adds to the reflective shine of the various stones and pearls that constitute the mirror cover, making it look even more ornate and luxurious.',
'rejected': 'The background fabric enhances the decorative appeal of the mirror cover by providing a luxurious golden setting for the mirror.',
'chosen_score': 5.0,
'rejected_score': 3.0
}
```
</details>
<details><summary>Detailed Caption Data</summary>
```bash
{
'id': 'xWwFJDHQQE4',
'video': 'xWwFJDHQQE4',
'conversations':
[
{
'from': 'human',
'value': "Provide a comprehensive analysis of the video's content and themes.\n<video>"
},
{
'from': 'gpt',
'value': 'The video opens with water coming to a boil in a stainless-steel pot. The second frame presents a close-up of a yellow packet of instant noodles with red and green accents, displaying text and images indicative of a masala flavor.
The subsequent frame shows a hand holding a seasoning sachet, preparing to tear it open. Another frame captures the block of dry noodles being placed into the boiling water, followed by a scene where a fork is used to stir the noodles as they cook.
The noodles change in consistency as they absorb the water and seasoning, turning softer and taking on a yellowish hue. As the video progresses, the fully cooked noodles are shown being served in a silver pot and then transferred onto a floral-design plate.
The final frame displays two children seated on a brown striped sofa, each with a plate of the prepared noodles, looking happy.'
}
]
}
```
</details>
<details><summary>Video QA data</summary>
```bash
{
'id': '---uhMEpio8_0',
'video': '---uhMEpio8',
'conversations':
[
{
'from': 'human',
'value': '<video>\nHow would you describe the design of the architectural structure in the video?'
},
{
'from': 'gpt',
'value': 'The architectural structure in the video has a tilted and angular appearance that defies conventional building structures.'
}
]
}
```
</details>
<br>
# Issues and Requests
Send them to the [project page](https://github.com/RifleZhang/LLaVA-Hound-DPO/tree/main).
# Citation
```
@misc{zhang2024direct,
title={Direct Preference Optimization of Video Large Multimodal Models from Language Model Reward},
author={Ruohong Zhang and Liangke Gui and Zhiqing Sun and Yihao Feng and Keyang Xu and Yuanhan Zhang and Di Fu and Chunyuan Li and Alexander Hauptmann and Yonatan Bisk and Yiming Yang},
year={2024},
eprint={2404.01258},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
``` |
Rami/utd_reddit.json | ---
dataset_info:
features:
- name: j52edo
struct:
- name: title
dtype: string
- name: selftext
dtype: string
- name: author
dtype: string
- name: num_comments
dtype: int64
- name: permalink
dtype: string
- name: url
dtype: string
- name: comments
struct:
- name: g7p723l
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7pmgai
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: replies
struct:
- name: g7q0gtr
struct:
- name: body
dtype: string
- name: g7p6z8q
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7q37rw
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: replies
struct:
- name: g7qjj6o
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7p4ynr
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7paxsm
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7p543c
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: replies
struct:
- name: g7pvhwr
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7qgcr3
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7p8y1o
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7pajp9
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7pn8t5
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: replies
struct:
- name: g7psgy5
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7s767n
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7qrjeu
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: replies
struct:
- name: g7r3brk
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7q48td
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7q3j2n
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: replies
struct:
- name: g7ujauu
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7pt766
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7pyov9
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: replies
struct:
- name: g7q1j3w
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7qvvrm
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: replies
struct:
- name: g7t8u30
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7sqe5g
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: gn3icng
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: replies
struct:
- name: gn3id7g
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7qjzq9
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: grxwrut
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: is1ekdj
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7q0gtr
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: replies
struct:
- name: g7qn1hx
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7qjj6o
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: replies
struct:
- name: g7tdb88
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7pvhwr
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7psgy5
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: replies
struct:
- name: g7psssg
struct:
- name: body
dtype: string
- name: g7r3brk
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7ujauu
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: replies
struct:
- name: g7ujcwo
struct:
- name: body
dtype: string
- name: g7q1j3w
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: replies
struct:
- name: g7q1ukv
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7t8u30
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: gn3id7g
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7qn1hx
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7tdb88
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7psssg
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: replies
struct:
- name: g7qvgs1
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7ujcwo
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7q1ukv
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
- name: g7qvgs1
struct:
- name: body
dtype: string
- name: author
dtype: string
- name: score
dtype: int64
splits:
- name: train
num_bytes: 5510
num_examples: 1
download_size: 94050
dataset_size: 5510
---
# Dataset Card for "utd_reddit.json"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
indicbench/truthfulqa_or | ---
dataset_info:
- config_name: default
features:
- name: _data_files
list:
- name: filename
dtype: string
- name: _fingerprint
dtype: string
- name: _format_columns
dtype: 'null'
- name: _format_type
dtype: 'null'
- name: _output_all_columns
dtype: bool
- name: _split
dtype: 'null'
splits:
- name: train
num_bytes: 107
num_examples: 2
download_size: 3274
dataset_size: 107
- config_name: generation
features:
- name: type
dtype: string
- name: category
dtype: string
- name: question
dtype: string
- name: best_answer
dtype: string
- name: correct_answers
sequence: string
- name: incorrect_answers
sequence: string
- name: source
dtype: string
splits:
- name: validation
num_bytes: 1091991
num_examples: 817
download_size: 347244
dataset_size: 1091991
- config_name: multiple_choice
features:
- name: question
dtype: string
- name: mc1_targets
struct:
- name: choices
sequence: string
- name: labels
sequence: int64
- name: mc2_targets
struct:
- name: choices
sequence: string
- name: labels
sequence: int64
splits:
- name: validation
num_bytes: 1516137
num_examples: 817
download_size: 450345
dataset_size: 1516137
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: generation
data_files:
- split: validation
path: generation/validation-*
- config_name: multiple_choice
data_files:
- split: validation
path: multiple_choice/validation-*
---
|
AiresPucrs/stanford-encyclopedia-philosophy | ---
language:
- en
task_categories:
- text-classification
- text-generation
dataset_info:
features:
- name: metadata
dtype: string
- name: text
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 146747420
num_examples: 182531
download_size: 72070745
dataset_size: 146747420
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: other
---
# Stanford Encyclopedia Philosophy
## Overview
The Stanford Encyclopedia of Philosophy (SEP) is a dynamic reference work comprising over 1,770 entries written by leading scholars in the field of philosophy.
This dataset contains the full text of all articles within the SEP.
Each row records the source page URL (`metadata`), the text of the page (`text`), and the subject of the page (`category`).
The dataset can be used for NLP applications such as text mining, classification, and generation.
## Dataset Details
This dataset was built from the articles of the [`Stanford Encyclopedia of Philosophy`](https://plato.stanford.edu/):
```
Title: The Stanford Encyclopedia of Philosophy
https://plato.stanford.edu/
Publisher:
The Metaphysics Research Lab
Philosophy Department
Stanford University
Stanford, CA 94305-4115
International Standard Serial Number: ISSN 1095-5054
```
- Dataset Name: [stanford-encyclopedia-philosophy](https://huggingface.co/datasets/AiresPucrs/stanford-encyclopedia-philosophy)
- Language: English
- Total Size: 182,531 rows
## Contents
The dataset consists of a data frame with the following columns:
- metadata
- text
- category
```json
{
  "metadata": "https://plato.stanford.edu/entries/abduction/",
  "text": "See also the entry on scientific discovery, in particular the section on discovery as abduction.",
  "category": "abduction"
}
```
## How to use
```python
from datasets import load_dataset
dataset = load_dataset("AiresPucrs/stanford-encyclopedia-philosophy", split='train')
```
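Once loaded, rows can be grouped by `category` for per-entry analysis. The sketch below mocks a few rows with the same schema, since loading the real dataset requires network access to the Hugging Face Hub; the URLs and passages are illustrative, not taken from the dataset:

```python
from collections import Counter

# Mock rows matching the dataset schema (metadata / text / category).
rows = [
    {"metadata": "https://plato.stanford.edu/entries/abduction/",
     "text": "See also the entry on scientific discovery.",
     "category": "abduction"},
    {"metadata": "https://plato.stanford.edu/entries/abduction/",
     "text": "Abduction is a form of inference.",
     "category": "abduction"},
    {"metadata": "https://plato.stanford.edu/entries/kant/",
     "text": "Immanuel Kant was a German philosopher.",
     "category": "kant"},
]

# Count how many text passages each entry contributes, e.g. as a
# first step toward per-category sampling or classification splits.
counts = Counter(row["category"] for row in rows)
print(counts.most_common())  # [('abduction', 2), ('kant', 1)]
```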
## License
The Stanford Encyclopedia of Philosophy Dataset is distributed under the SEP's own [terms of use](https://plato.stanford.edu/info.html) (listed here as license `other`).
# Cite as
```
@misc{teenytinycastle,
doi = {10.5281/zenodo.7112065},
url = {https://github.com/Nkluge-correa/TeenyTinyCastle},
author = {Nicholas Kluge Corr{\^e}a},
title = {Teeny-Tiny Castle},
year = {2024},
publisher = {GitHub},
journal = {GitHub repository}
}
```
**Disclaimer**
This dataset is provided as is, without any warranty or guarantee of its accuracy or suitability for any purpose.
The creators and contributors of this dataset are not liable for any damages or losses arising from its use.
Please review and comply with the licenses and terms of the original datasets before use. |
Somasankalp93/Daylifecycle | ---
license: apache-2.0
---
|
Kaue123456/PauloTeixeiraPatati | ---
license: openrail
---
|
hacktoberfest-corpus-es/spanish_dish_instruction | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: text
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 163569776.9644282
num_examples: 4416
- name: test
num_bytes: 8142090.336714364
num_examples: 221
- name: valid
num_bytes: 31971355.346857455
num_examples: 884
download_size: 206512305
dataset_size: 203683222.648
---
|
kimgahyeon/testtrain | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 88225938
num_examples: 60260
download_size: 15196617
dataset_size: 88225938
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AdapterOcean/med_alpaca_standardized_cluster_57_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 6563054
num_examples: 4440
download_size: 3192677
dataset_size: 6563054
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_57_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
McSpicyWithMilo/directions-0.2split-new-move | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: direction
dtype: string
splits:
- name: train
num_bytes: 8275
num_examples: 80
- name: test
num_bytes: 2044
num_examples: 20
download_size: 7747
dataset_size: 10319
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "directions-0.2split-new-move"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vwxyzjn/ultrachat_200k_filtered_1710165338 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: query
list:
- name: content
dtype: string
- name: role
dtype: string
- name: query_token
sequence: int64
- name: query_reference_response
list:
- name: content
dtype: string
- name: role
dtype: string
- name: query_reference_response_token
sequence: int64
- name: query_reference_response_token_len
dtype: int64
- name: query_token_len
dtype: int64
- name: reference_response
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: reference_response_token
sequence: int64
- name: reference_response_token_len
dtype: int64
splits:
- name: train_sft
num_bytes: 2329511242.072162
num_examples: 80035
- name: test_sft
num_bytes: 261532088.877196
num_examples: 8992
download_size: 493338154
dataset_size: 2591043330.949358
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: test_sft
path: data/test_sft-*
---
# Args
```python
{'base_model': 'mistralai/Mistral-7B-v0.1',
'check_length_correctness': True,
'debug': False,
'hf_entity': 'vwxyzjn',
'params': TaskQueryHParams(length=None,
format_str='SUBREDDIT: r/{subreddit}\n'
'\n'
'TITLE: {title}\n'
'\n'
'POST: {post}\n'
'\n'
'TL;DR:',
truncate_field='post',
truncate_text='\n',
padding='pad_token',
pad_token=[32000],
pad_side='left',
max_query_length=1024,
max_sft_query_response_length=1280,
max_sft_response_length=256,
max_rm_query_response_length=1280,
max_rm_response_length=256),
'push_to_hub': True}
```
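For reference, the `format_str` above is an ordinary Python format template, and `pad_side='left'` means token sequences shorter than the maximum length are prefixed with the pad token. A minimal sketch of both, with illustrative field values rather than the pipeline's actual code:

```python
# The query template from the args above.
format_str = ("SUBREDDIT: r/{subreddit}\n\n"
              "TITLE: {title}\n\n"
              "POST: {post}\n\n"
              "TL;DR:")

# Hypothetical field values, not real dataset rows.
query = format_str.format(
    subreddit="AskReddit",
    title="What is your favorite book?",
    post="Looking for recommendations.",
)
print(query.splitlines()[0])  # SUBREDDIT: r/AskReddit

def pad_left(tokens, length, pad_token=32000):
    """Prefix `tokens` with pad-token ids until it has `length` elements."""
    return [pad_token] * (length - len(tokens)) + tokens

# pad_token=[32000], pad_side='left' as in TaskQueryHParams above;
# the tiny length here stands in for max_query_length=1024.
print(pad_left([5, 6, 7], 6))  # [32000, 32000, 32000, 5, 6, 7]
```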
|
open-llm-leaderboard/details_postbot__distilgpt2-emailgen-V2 | ---
pretty_name: Evaluation run of postbot/distilgpt2-emailgen-V2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [postbot/distilgpt2-emailgen-V2](https://huggingface.co/postbot/distilgpt2-emailgen-V2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_postbot__distilgpt2-emailgen-V2_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-13T13:28:50.616028](https://huggingface.co/datasets/open-llm-leaderboard/details_postbot__distilgpt2-emailgen-V2_public/blob/main/results_2023-11-13T13-28-50.616028.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2542066525769912,\n\
\ \"acc_stderr\": 0.030683618404772357,\n \"acc_norm\": 0.2547326716552163,\n\
\ \"acc_norm_stderr\": 0.031502030622377816,\n \"mc1\": 0.2717258261933905,\n\
\ \"mc1_stderr\": 0.015572840452875828,\n \"mc2\": 0.4651319733972654,\n\
\ \"mc2_stderr\": 0.016103347289806055,\n \"em\": 0.0,\n \"\
em_stderr\": 0.0,\n \"f1\": 0.003143875838926175,\n \"f1_stderr\"\
: 0.00031171556932365637\n },\n \"harness|arc:challenge|25\": {\n \"\
acc\": 0.1689419795221843,\n \"acc_stderr\": 0.01094979565248503,\n \
\ \"acc_norm\": 0.2098976109215017,\n \"acc_norm_stderr\": 0.011900548748047442\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.26598287193786097,\n\
\ \"acc_stderr\": 0.004409521343140109,\n \"acc_norm\": 0.26777534355706034,\n\
\ \"acc_norm_stderr\": 0.004418948941099411\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.32592592592592595,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.32592592592592595,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.16447368421052633,\n \"acc_stderr\": 0.03016753346863271,\n\
\ \"acc_norm\": 0.16447368421052633,\n \"acc_norm_stderr\": 0.03016753346863271\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.22641509433962265,\n \"acc_stderr\": 0.025757559893106744,\n\
\ \"acc_norm\": 0.22641509433962265,\n \"acc_norm_stderr\": 0.025757559893106744\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23121387283236994,\n\
\ \"acc_stderr\": 0.0321473730202947,\n \"acc_norm\": 0.23121387283236994,\n\
\ \"acc_norm_stderr\": 0.0321473730202947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087785,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.15,\n \"acc_stderr\": 0.035887028128263714,\n \"acc_norm\": 0.15,\n\
\ \"acc_norm_stderr\": 0.035887028128263714\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.23829787234042554,\n \"acc_stderr\": 0.02785125297388979,\n\
\ \"acc_norm\": 0.23829787234042554,\n \"acc_norm_stderr\": 0.02785125297388979\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924811,\n\
\ \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924811\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15873015873015872,\n\
\ \"acc_stderr\": 0.03268454013011743,\n \"acc_norm\": 0.15873015873015872,\n\
\ \"acc_norm_stderr\": 0.03268454013011743\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n\
\ \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n\
\ \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.20606060606060606,\n \"acc_stderr\": 0.03158415324047707,\n\
\ \"acc_norm\": 0.20606060606060606,\n \"acc_norm_stderr\": 0.03158415324047707\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35858585858585856,\n \"acc_stderr\": 0.03416903640391521,\n \"\
acc_norm\": 0.35858585858585856,\n \"acc_norm_stderr\": 0.03416903640391521\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.030276909945178256,\n\
\ \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.030276909945178256\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722128013,\n\
\ \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722128013\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176088,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176088\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.02665353159671548,\n\
\ \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.02665353159671548\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473835,\n \"\
acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473835\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21467889908256882,\n \"acc_stderr\": 0.01760430414925648,\n \"\
acc_norm\": 0.21467889908256882,\n \"acc_norm_stderr\": 0.01760430414925648\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2062780269058296,\n\
\ \"acc_stderr\": 0.027157150479563824,\n \"acc_norm\": 0.2062780269058296,\n\
\ \"acc_norm_stderr\": 0.027157150479563824\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.043300437496507437,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.043300437496507437\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952687,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952687\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.20512820512820512,\n\
\ \"acc_stderr\": 0.02645350805404035,\n \"acc_norm\": 0.20512820512820512,\n\
\ \"acc_norm_stderr\": 0.02645350805404035\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2656449553001277,\n\
\ \"acc_stderr\": 0.015794302487888726,\n \"acc_norm\": 0.2656449553001277,\n\
\ \"acc_norm_stderr\": 0.015794302487888726\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.02207570925175717,\n\
\ \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.02207570925175717\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.238562091503268,\n \"acc_stderr\": 0.024404394928087873,\n\
\ \"acc_norm\": 0.238562091503268,\n \"acc_norm_stderr\": 0.024404394928087873\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.2990353697749196,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.02540719779889016,\n\
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02540719779889016\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432407,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432407\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n\
\ \"acc_stderr\": 0.010896123652676651,\n \"acc_norm\": 0.2392438070404172,\n\
\ \"acc_norm_stderr\": 0.010896123652676651\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4264705882352941,\n \"acc_stderr\": 0.030042615832714854,\n\
\ \"acc_norm\": 0.4264705882352941,\n \"acc_norm_stderr\": 0.030042615832714854\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26633986928104575,\n \"acc_stderr\": 0.017883188134667178,\n \
\ \"acc_norm\": 0.26633986928104575,\n \"acc_norm_stderr\": 0.017883188134667178\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2163265306122449,\n \"acc_stderr\": 0.026358916334904035,\n\
\ \"acc_norm\": 0.2163265306122449,\n \"acc_norm_stderr\": 0.026358916334904035\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.29518072289156627,\n\
\ \"acc_stderr\": 0.035509201856896294,\n \"acc_norm\": 0.29518072289156627,\n\
\ \"acc_norm_stderr\": 0.035509201856896294\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2717258261933905,\n\
\ \"mc1_stderr\": 0.015572840452875828,\n \"mc2\": 0.4651319733972654,\n\
\ \"mc2_stderr\": 0.016103347289806055\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5201262825572218,\n \"acc_stderr\": 0.01404109666434433\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.0,\n \"em_stderr\"\
: 0.0,\n \"f1\": 0.003143875838926175,\n \"f1_stderr\": 0.00031171556932365637\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/postbot/distilgpt2-emailgen-V2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|arc:challenge|25_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|drop|3_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|gsm8k|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hellaswag|10_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T13-28-50.616028.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T13-28-50.616028.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- '**/details_harness|winogrande|5_2023-11-13T13-28-50.616028.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-13T13-28-50.616028.parquet'
- config_name: results
data_files:
- split: 2023_11_13T13_28_50.616028
path:
- results_2023-11-13T13-28-50.616028.parquet
- split: latest
path:
- results_2023-11-13T13-28-50.616028.parquet
---
# Dataset Card for Evaluation run of postbot/distilgpt2-emailgen-V2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/postbot/distilgpt2-emailgen-V2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [postbot/distilgpt2-emailgen-V2](https://huggingface.co/postbot/distilgpt2-emailgen-V2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_postbot__distilgpt2-emailgen-V2_public",
	"harness_winogrande_5",
	split="latest")
```
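Once loaded, the aggregated "results" configuration exposes per-task metrics as nested dictionaries keyed by task name, with metric names ("acc", "acc_norm", etc.) mapping to floats. A minimal sketch of extracting each task's accuracy from such a dictionary (the values below are illustrative placeholders, not taken from this run):

```python
# Illustrative: pull per-task accuracy out of a results-style nested dict.
# The structure mirrors the JSON shown under "Latest results" below;
# the task names and numbers here are placeholder examples.
results = {
    "harness|arc:challenge|25": {"acc": 0.169, "acc_norm": 0.210},
    "harness|hellaswag|10": {"acc": 0.266, "acc_norm": 0.268},
}

# Map each task name to its accuracy, then print them best-first.
accuracies = {task: metrics["acc"] for task, metrics in results.items()}
for task, acc in sorted(accuracies.items(), key=lambda kv: -kv[1]):
    print(f"{task}: {acc:.3f}")
```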
## Latest results
These are the [latest results from run 2023-11-13T13:28:50.616028](https://huggingface.co/datasets/open-llm-leaderboard/details_postbot__distilgpt2-emailgen-V2_public/blob/main/results_2023-11-13T13-28-50.616028.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.2542066525769912,
"acc_stderr": 0.030683618404772357,
"acc_norm": 0.2547326716552163,
"acc_norm_stderr": 0.031502030622377816,
"mc1": 0.2717258261933905,
"mc1_stderr": 0.015572840452875828,
"mc2": 0.4651319733972654,
"mc2_stderr": 0.016103347289806055,
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.003143875838926175,
"f1_stderr": 0.00031171556932365637
},
"harness|arc:challenge|25": {
"acc": 0.1689419795221843,
"acc_stderr": 0.01094979565248503,
"acc_norm": 0.2098976109215017,
"acc_norm_stderr": 0.011900548748047442
},
"harness|hellaswag|10": {
"acc": 0.26598287193786097,
"acc_stderr": 0.004409521343140109,
"acc_norm": 0.26777534355706034,
"acc_norm_stderr": 0.004418948941099411
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.16447368421052633,
"acc_stderr": 0.03016753346863271,
"acc_norm": 0.16447368421052633,
"acc_norm_stderr": 0.03016753346863271
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22641509433962265,
"acc_stderr": 0.025757559893106744,
"acc_norm": 0.22641509433962265,
"acc_norm_stderr": 0.025757559893106744
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.0321473730202947,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.0321473730202947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.043898699568087785,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.043898699568087785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.15,
"acc_stderr": 0.035887028128263714,
"acc_norm": 0.15,
"acc_norm_stderr": 0.035887028128263714
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.23829787234042554,
"acc_stderr": 0.02785125297388979,
"acc_norm": 0.23829787234042554,
"acc_norm_stderr": 0.02785125297388979
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.03455930201924811,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.03455930201924811
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15873015873015872,
"acc_stderr": 0.03268454013011743,
"acc_norm": 0.15873015873015872,
"acc_norm_stderr": 0.03268454013011743
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.20606060606060606,
"acc_stderr": 0.03158415324047707,
"acc_norm": 0.20606060606060606,
"acc_norm_stderr": 0.03158415324047707
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35858585858585856,
"acc_stderr": 0.03416903640391521,
"acc_norm": 0.35858585858585856,
"acc_norm_stderr": 0.03416903640391521
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.030276909945178256,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.030276909945178256
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.020752423722128013,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.020752423722128013
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176088,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.02665353159671548,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.02665353159671548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2052980132450331,
"acc_stderr": 0.03297986648473835,
"acc_norm": 0.2052980132450331,
"acc_norm_stderr": 0.03297986648473835
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21467889908256882,
"acc_stderr": 0.01760430414925648,
"acc_norm": 0.21467889908256882,
"acc_norm_stderr": 0.01760430414925648
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2062780269058296,
"acc_stderr": 0.027157150479563824,
"acc_norm": 0.2062780269058296,
"acc_norm_stderr": 0.027157150479563824
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.043300437496507437,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.043300437496507437
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952687,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952687
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.20512820512820512,
"acc_stderr": 0.02645350805404035,
"acc_norm": 0.20512820512820512,
"acc_norm_stderr": 0.02645350805404035
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2656449553001277,
"acc_stderr": 0.015794302487888726,
"acc_norm": 0.2656449553001277,
"acc_norm_stderr": 0.015794302487888726
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.02207570925175717,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.02207570925175717
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.238562091503268,
"acc_stderr": 0.024404394928087873,
"acc_norm": 0.238562091503268,
"acc_norm_stderr": 0.024404394928087873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.02540719779889016,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.02540719779889016
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432407,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.010896123652676651,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.010896123652676651
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4264705882352941,
"acc_stderr": 0.030042615832714854,
"acc_norm": 0.4264705882352941,
"acc_norm_stderr": 0.030042615832714854
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26633986928104575,
"acc_stderr": 0.017883188134667178,
"acc_norm": 0.26633986928104575,
"acc_norm_stderr": 0.017883188134667178
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.038950910157241364,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.038950910157241364
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2163265306122449,
"acc_stderr": 0.026358916334904035,
"acc_norm": 0.2163265306122449,
"acc_norm_stderr": 0.026358916334904035
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.29518072289156627,
"acc_stderr": 0.035509201856896294,
"acc_norm": 0.29518072289156627,
"acc_norm_stderr": 0.035509201856896294
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2717258261933905,
"mc1_stderr": 0.015572840452875828,
"mc2": 0.4651319733972654,
"mc2_stderr": 0.016103347289806055
},
"harness|winogrande|5": {
"acc": 0.5201262825572218,
"acc_stderr": 0.01404109666434433
},
"harness|drop|3": {
"em": 0.0,
"em_stderr": 0.0,
"f1": 0.003143875838926175,
"f1_stderr": 0.00031171556932365637
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
argilla/10k_prompts_ranked_mistral_large_responses | ---
dataset_info:
features:
- name: input
dtype: string
- name: quality
list:
- name: status
dtype: string
- name: user_id
dtype: string
- name: value
dtype: string
- name: metadata
dtype: string
- name: avg_rating
dtype: float64
- name: num_responses
dtype: int64
- name: agreement_ratio
dtype: float64
- name: raw_responses
sequence: int64
- name: kind
dtype: string
- name: generation_model
sequence: string
- name: generation_prompt
list:
list:
- name: content
dtype: string
- name: role
dtype: string
- name: raw_generation_responses
sequence: string
- name: generations
sequence: string
splits:
- name: train
num_bytes: 48139476
num_examples: 10331
download_size: 26098357
dataset_size: 48139476
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
tags:
- synthetic
- distilabel
---
## Description
This dataset contains responses generated for the prompts of the [DIBT/10k_prompts_ranked](https://huggingface.co/datasets/DIBT/10k_prompts_ranked) dataset, using [distilabel](https://github.com/argilla-io/distilabel)
with [`mistral-large`](https://docs.mistral.ai/platform/endpoints/). The script used for the generation can be found in the repository: `generate_reference_spin.py`. |
liuyanchen1015/VALUE_wikitext2_got | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: test
num_bytes: 96781
num_examples: 118
- name: train
num_bytes: 947742
num_examples: 1195
- name: validation
num_bytes: 78369
num_examples: 91
download_size: 705607
dataset_size: 1122892
---
# Dataset Card for "VALUE_wikitext2_got"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shuyuej/paraphrasing_cot | ---
license: apache-2.0
---
# 🚀 Load Dataset
```python
from datasets import load_dataset
dataset = load_dataset("shuyuej/paraphrasing_cot")
dataset = dataset["train"]
print(dataset)
```
|
Seenka/canal_america | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': dudoso
'1': none
'2': videograph
'3': videograph_dudoso
'4': zocalo
'5': zocalo_dudoso
- name: yolo_out
list:
- name: class
dtype: int64
- name: confidence
dtype: float64
- name: name
dtype: string
- name: xmax
dtype: float64
- name: xmin
dtype: float64
- name: ymax
dtype: float64
- name: ymin
dtype: float64
- name: cropped_image
dtype: image
- name: ocr_out
list:
- name: bbox
sequence:
sequence: float64
- name: confidence
dtype: float64
- name: text
dtype: string
- name: embeddings
sequence: float32
- name: embeddings_cropped
sequence: float32
splits:
- name: train
num_bytes: 28353802.0
num_examples: 193
download_size: 28710148
dataset_size: 28353802.0
---
# Dataset Card for "canal_america"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_cris177__DesivoMerge0.1 | ---
pretty_name: Evaluation run of cris177/DesivoMerge0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cris177/DesivoMerge0.1](https://huggingface.co/cris177/DesivoMerge0.1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cris177__DesivoMerge0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-26T05:06:30.037096](https://huggingface.co/datasets/open-llm-leaderboard/details_cris177__DesivoMerge0.1/blob/main/results_2024-01-26T05-06-30.037096.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6450871993201176,\n\
\ \"acc_stderr\": 0.032053127360967146,\n \"acc_norm\": 0.6473153652600754,\n\
\ \"acc_norm_stderr\": 0.03268995090373499,\n \"mc1\": 0.3880048959608323,\n\
\ \"mc1_stderr\": 0.01705876150134797,\n \"mc2\": 0.5536080256437423,\n\
\ \"mc2_stderr\": 0.015472900565275048\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.628839590443686,\n \"acc_stderr\": 0.014117971901142822,\n\
\ \"acc_norm\": 0.658703071672355,\n \"acc_norm_stderr\": 0.013855831287497726\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6718781119298944,\n\
\ \"acc_stderr\": 0.004685698752104803,\n \"acc_norm\": 0.8539135630352519,\n\
\ \"acc_norm_stderr\": 0.003524710243768616\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.02845015479411864,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.02845015479411864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7806451612903226,\n \"acc_stderr\": 0.02354079935872329,\n \"\
acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.02354079935872329\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7100840336134454,\n \"acc_stderr\": 0.0294724858331361,\n \
\ \"acc_norm\": 0.7100840336134454,\n \"acc_norm_stderr\": 0.0294724858331361\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461783,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461783\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368982,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368982\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3664804469273743,\n\
\ \"acc_stderr\": 0.016115235504865467,\n \"acc_norm\": 0.3664804469273743,\n\
\ \"acc_norm_stderr\": 0.016115235504865467\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46479791395045633,\n\
\ \"acc_stderr\": 0.012738547371303957,\n \"acc_norm\": 0.46479791395045633,\n\
\ \"acc_norm_stderr\": 0.012738547371303957\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495155,\n \
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495155\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.027979823538744543,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.027979823538744543\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3880048959608323,\n\
\ \"mc1_stderr\": 0.01705876150134797,\n \"mc2\": 0.5536080256437423,\n\
\ \"mc2_stderr\": 0.015472900565275048\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7853196527229677,\n \"acc_stderr\": 0.011539912734345396\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5852918877937832,\n \
\ \"acc_stderr\": 0.013570623842304511\n }\n}\n```"
repo_url: https://huggingface.co/cris177/DesivoMerge0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|arc:challenge|25_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|gsm8k|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hellaswag|10_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T05-06-30.037096.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-26T05-06-30.037096.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- '**/details_harness|winogrande|5_2024-01-26T05-06-30.037096.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-26T05-06-30.037096.parquet'
- config_name: results
data_files:
- split: 2024_01_26T05_06_30.037096
path:
- results_2024-01-26T05-06-30.037096.parquet
- split: latest
path:
- results_2024-01-26T05-06-30.037096.parquet
---
# Dataset Card for Evaluation run of cris177/DesivoMerge0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cris177/DesivoMerge0.1](https://huggingface.co/cris177/DesivoMerge0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cris177__DesivoMerge0.1",
	"harness_winogrande_5",
	split="latest")
```
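Since each eval run adds a new timestamped split alongside `latest`, the most recent run can also be located by sorting the timestamped split names, which follow the pattern `YYYY_MM_DDTHH_MM_SS.ffffff` and therefore sort chronologically. This is a minimal sketch; the split names below are taken from this card:

```python
def most_recent_split(split_names):
    """Return the name of the most recent timestamped split.

    Split names of the form YYYY_MM_DDTHH_MM_SS.ffffff sort
    lexicographically in chronological order, so the last element
    after sorting is the newest run. Falls back to "latest" if no
    timestamped split is present.
    """
    timestamped = sorted(s for s in split_names if s != "latest")
    return timestamped[-1] if timestamped else "latest"

splits = ["2024_01_26T05_06_30.037096", "latest"]
print(most_recent_split(splits))  # → 2024_01_26T05_06_30.037096
```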
## Latest results
These are the [latest results from run 2024-01-26T05:06:30.037096](https://huggingface.co/datasets/open-llm-leaderboard/details_cris177__DesivoMerge0.1/blob/main/results_2024-01-26T05-06-30.037096.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of its own configuration):
```python
{
"all": {
"acc": 0.6450871993201176,
"acc_stderr": 0.032053127360967146,
"acc_norm": 0.6473153652600754,
"acc_norm_stderr": 0.03268995090373499,
"mc1": 0.3880048959608323,
"mc1_stderr": 0.01705876150134797,
"mc2": 0.5536080256437423,
"mc2_stderr": 0.015472900565275048
},
"harness|arc:challenge|25": {
"acc": 0.628839590443686,
"acc_stderr": 0.014117971901142822,
"acc_norm": 0.658703071672355,
"acc_norm_stderr": 0.013855831287497726
},
"harness|hellaswag|10": {
"acc": 0.6718781119298944,
"acc_stderr": 0.004685698752104803,
"acc_norm": 0.8539135630352519,
"acc_norm_stderr": 0.003524710243768616
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.02845015479411864,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.02845015479411864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.02354079935872329,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.02354079935872329
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7100840336134454,
"acc_stderr": 0.0294724858331361,
"acc_norm": 0.7100840336134454,
"acc_norm_stderr": 0.0294724858331361
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461783,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461783
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368982,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368982
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3664804469273743,
"acc_stderr": 0.016115235504865467,
"acc_norm": 0.3664804469273743,
"acc_norm_stderr": 0.016115235504865467
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46479791395045633,
"acc_stderr": 0.012738547371303957,
"acc_norm": 0.46479791395045633,
"acc_norm_stderr": 0.012738547371303957
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495155,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495155
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.027979823538744543,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.027979823538744543
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3880048959608323,
"mc1_stderr": 0.01705876150134797,
"mc2": 0.5536080256437423,
"mc2_stderr": 0.015472900565275048
},
"harness|winogrande|5": {
"acc": 0.7853196527229677,
"acc_stderr": 0.011539912734345396
},
"harness|gsm8k|5": {
"acc": 0.5852918877937832,
"acc_stderr": 0.013570623842304511
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
L4NLP/chunkllama-long-dialog | ---
license: cc-by-nc-4.0
tags:
- instruction-finetuning
pretty_name: LongDialogue
task_categories:
- text-generation
language:
- en
size_categories:
- 1K<n<10K
---
This dataset comprises *concatenated long conversation data*, with each entry approximately 16k tokens, sourced from AlpacaGPT-4 and ShareGPT. We use [Vicuna's](https://github.com/lm-sys/FastChat) scripts to pre-process the ShareGPT data and retain only very long conversations. We also keep all samples whose responses were generated by GPT-4. Fine-tuning Llama 2 on this dataset often yields performance comparable to that of Vicuna v1.5 with a 16k context length.
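A minimal sketch of the kind of length filter described above, using a whitespace token count as a stand-in for a real tokenizer (the helper names and the exact threshold are illustrative; the actual pipeline uses Vicuna's pre-processing scripts):

```python
def approx_token_len(conversation):
    """Rough length proxy: whitespace tokens across all turns
    (a stand-in for a real tokenizer)."""
    return sum(len(turn.split()) for turn in conversation)

def keep_long(conversations, min_tokens=16000):
    """Retain only conversations at or above the token threshold."""
    return [c for c in conversations if approx_token_len(c) >= min_tokens]

short = ["Hi there!", "Hello, how can I help?"]
long_conv = ["question " * 9000, "answer " * 8000]

kept = keep_long([short, long_conv])
print(len(kept))  # 1 -- only the ~17k-token conversation survives
```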
### Usage
```python
from datasets import load_dataset
dataset = load_dataset("L4NLP/chunkllama-long-dialog", split="train")
print(dataset[0].keys())  # dict_keys(['inputs', 'outputs', 'length', 'source']): user queries, chatbot responses, conversation length, and data source
``` |
sachith-surge/LaMini | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: source
dtype: string
- name: response
dtype: string
- name: llama2_status
dtype: string
- name: llama2_rating
dtype: string
- name: llama2_reason
dtype: string
- name: gpt4_status
dtype: string
- name: gpt4_rating
dtype: string
- name: gpt4_reason
dtype: string
- name: falcon_status
dtype: string
- name: falcon_rating
dtype: string
- name: falcon_reason
dtype: string
splits:
- name: train
num_bytes: 3287768
num_examples: 1504
download_size: 1603115
dataset_size: 3287768
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "LaMini-LM-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML-eval-llama2-gpt4-falcon"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mHossain/processed_demo | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: input_text
dtype: string
- name: target_text
dtype: string
- name: prefix
dtype: string
splits:
- name: train
num_bytes: 11555846.1
num_examples: 27000
- name: test
num_bytes: 1283982.9
num_examples: 3000
download_size: 5616892
dataset_size: 12839829.0
---
# Dataset Card for "processed_demo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/Hindi_Speech_Data_by_Mobile_Phone_Guiding | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/Hindi_Speech_Data_by_Mobile_Phone_Guiding
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/119?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
The data was recorded by 397 Indian speakers with authentic accents, 50 sentences per speaker, for a total of 8.6 hours. The recordings cover car, smart-home, and intelligent voice assistant scenarios. This data can be used for corpus construction in machine translation, as well as model training and algorithm research for voiceprint recognition.
For more details, please refer to the link: https://www.nexdata.ai/datasets/119?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train a model for Automatic Speech Recognition (ASR).
### Languages
Hindi
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions |
Jashenco/fa-icons | ---
license: cc-by-4.0
dataset_info:
features:
- name: file_name
dtype: string
- name: tags
dtype: string
- name: image
dtype: binary
splits:
- name: train
num_bytes: 29165035
num_examples: 2045
download_size: 29058885
dataset_size: 29165035
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
shrek101202303/ai | ---
license: apache-2.0
---
|
jelber2/RustBioGPT-valid | ---
license: mit
---
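The shell pipeline below escapes each Rust source file before embedding it as a CSV field: real newlines become a literal `\n`, double quotes become single quotes, and every field is wrapped in double quotes. A minimal Python sketch of the same escaping (the helper name and sample row are hypothetical, for illustration only):

```python
import csv
import io

def escape_content(text: str) -> str:
    """Mirror the perl substitutions: real newlines become a literal
    backslash-n, and double quotes become single quotes."""
    return text.replace("\n", "\\n").replace('"', "'")

# Hypothetical row: repo name, file path, escaped source, license.
row = [
    "rust-bio-tools",
    "./src/main.rs",
    escape_content('fn main() {\n    println!("hi");\n}'),
    "mit",
]

buf = io.StringIO()
csv.writer(buf, quoting=csv.QUOTE_ALL).writerow(row)
line = buf.getvalue().strip()
print(line)
# "rust-bio-tools","./src/main.rs","fn main() {\n    println!('hi');\n}","mit"
```

Note that the quote substitution is lossy by design: reading the file back with `csv.reader` undoes only the quoting, not the newline and double-quote replacements.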
```sh
# Clone the source repository.
git clone https://github.com/rust-bio/rust-bio-tools
# Build one CSV row per .rs file: quoted repo name, file path, escaped file
# content (real newlines -> literal "\n", double quotes -> single quotes), and license.
rm -f RustBioGPT-validate.csv && for i in `find . -name "*.rs"`;do paste -d "," <(echo "rust-bio-tools"|perl -pe "s/(.+)/\"\1\"/g") <(echo $i|perl -pe "s/(.+)/\"\1\"/g") <(perl -pe "s/\n/\\\n/g" $i|perl -pe s"/\"/\'/g" |perl -pe "s/(.+)/\"\1\"/g") <(echo "mit"|perl -pe "s/(.+)/\"\1\"/g") >> RustBioGPT-validate.csv; done
# Prepend the CSV header row.
sed -i '1i "repo_name","path","content","license"' RustBioGPT-validate.csv
``` |
result-kand2-sdxl-wuerst-karlo/c09c453b | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 195
num_examples: 10
download_size: 1353
dataset_size: 195
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "c09c453b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DatasetingBR/luazulciano | ---
license: openrail
---
|
alexparksunghyun08/transformers_sunghyun_economics | ---
license: apache-2.0
---
|
distilled-one-sec-cv12-each-chunk-uniq/chunk_30 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 797718080.0
num_examples: 155440
download_size: 816841511
dataset_size: 797718080.0
---
# Dataset Card for "chunk_30"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Thewillonline/gpt4 | ---
language:
- en
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 22272464157
num_examples: 536764548
download_size: 14337362159
dataset_size: 22272464157
---
|
Santp98/Secop2_documents | ---
language:
- es
license: mit
pretty_name: Secop2 documents
dataset_info:
features:
- name: id_doc
dtype: string
- name: doc_text
dtype: string
splits:
- name: train
num_bytes: 303997310.5045912
num_examples: 13460
- name: validation
num_bytes: 101339965.24770437
num_examples: 4487
- name: test
num_bytes: 101339965.24770437
num_examples: 4487
download_size: 232995741
dataset_size: 506677241.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
tags:
- legal
---
|
open-llm-leaderboard/details_TheBloke__koala-13B-HF | ---
pretty_name: Evaluation run of TheBloke/koala-13B-HF
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/koala-13B-HF](https://huggingface.co/TheBloke/koala-13B-HF) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__koala-13B-HF\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T08:43:38.346498](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__koala-13B-HF/blob/main/results_2023-10-22T08-43-38.346498.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.021707214765100673,\n\
\ \"em_stderr\": 0.0014923686874006184,\n \"f1\": 0.09106753355704705,\n\
\ \"f1_stderr\": 0.0020580604985252385,\n \"acc\": 0.40428250097386687,\n\
\ \"acc_stderr\": 0.009634029824810052\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.021707214765100673,\n \"em_stderr\": 0.0014923686874006184,\n\
\ \"f1\": 0.09106753355704705,\n \"f1_stderr\": 0.0020580604985252385\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06823351023502654,\n \
\ \"acc_stderr\": 0.006945358944067431\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7403314917127072,\n \"acc_stderr\": 0.012322700705552673\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/koala-13B-HF
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T08_43_38.346498
path:
- '**/details_harness|drop|3_2023-10-22T08-43-38.346498.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T08-43-38.346498.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T08_43_38.346498
path:
- '**/details_harness|gsm8k|5_2023-10-22T08-43-38.346498.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T08-43-38.346498.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:49:04.838102.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:49:04.838102.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:49:04.838102.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T08_43_38.346498
path:
- '**/details_harness|winogrande|5_2023-10-22T08-43-38.346498.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T08-43-38.346498.parquet'
- config_name: results
data_files:
- split: 2023_07_19T18_49_04.838102
path:
- results_2023-07-19T18:49:04.838102.parquet
- split: 2023_10_22T08_43_38.346498
path:
- results_2023-10-22T08-43-38.346498.parquet
- split: latest
path:
- results_2023-10-22T08-43-38.346498.parquet
---
# Dataset Card for Evaluation run of TheBloke/koala-13B-HF
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/koala-13B-HF
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/koala-13B-HF](https://huggingface.co/TheBloke/koala-13B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__koala-13B-HF",
"harness_winogrande_5",
                    split="latest")
```
## Latest results
These are the [latest results from run 2023-10-22T08:43:38.346498](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__koala-13B-HF/blob/main/results_2023-10-22T08-43-38.346498.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results files and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.021707214765100673,
"em_stderr": 0.0014923686874006184,
"f1": 0.09106753355704705,
"f1_stderr": 0.0020580604985252385,
"acc": 0.40428250097386687,
"acc_stderr": 0.009634029824810052
},
"harness|drop|3": {
"em": 0.021707214765100673,
"em_stderr": 0.0014923686874006184,
"f1": 0.09106753355704705,
"f1_stderr": 0.0020580604985252385
},
"harness|gsm8k|5": {
"acc": 0.06823351023502654,
"acc_stderr": 0.006945358944067431
},
"harness|winogrande|5": {
"acc": 0.7403314917127072,
"acc_stderr": 0.012322700705552673
}
}
```
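As a sanity check, the aggregated `"all"` block is consistent with the per-task numbers: its `acc` is the plain (unweighted) mean of the task-level accuracies. A minimal sketch of that check, with the values copied from the JSON above (the dict shape here is just for illustration, not the on-disk format):

```python
# Per-task metrics copied from the latest-results JSON above.
results = {
    "harness|drop|3": {"em": 0.021707214765100673, "f1": 0.09106753355704705},
    "harness|gsm8k|5": {"acc": 0.06823351023502654},
    "harness|winogrande|5": {"acc": 0.7403314917127072},
}

# The "all" accuracy is the unweighted mean over tasks that report `acc`.
accs = [m["acc"] for m in results.values() if "acc" in m]
avg_acc = sum(accs) / len(accs)
print(avg_acc)  # ≈ 0.40428250097386687, matching the "all" block above
```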
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
taskmaster3 | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-generation
- fill-mask
task_ids:
- dialogue-modeling
paperswithcode_id: null
pretty_name: taskmaster3
dataset_info:
features:
- name: conversation_id
dtype: string
- name: vertical
dtype: string
- name: instructions
dtype: string
- name: scenario
dtype: string
- name: utterances
list:
- name: index
dtype: int32
- name: speaker
dtype: string
- name: text
dtype: string
- name: apis
list:
- name: name
dtype: string
- name: index
dtype: int32
- name: args
list:
- name: arg_name
dtype: string
- name: arg_value
dtype: string
- name: response
list:
- name: response_name
dtype: string
- name: response_value
dtype: string
- name: segments
list:
- name: start_index
dtype: int32
- name: end_index
dtype: int32
- name: text
dtype: string
- name: annotations
list:
- name: name
dtype: string
splits:
- name: train
num_bytes: 143609327
num_examples: 23757
download_size: 313402141
dataset_size: 143609327
---
# Dataset Card for taskmaster3
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Taskmaster](https://research.google/tools/datasets/taskmaster-1/)
- **Repository:** [GitHub](https://github.com/google-research-datasets/Taskmaster/tree/master/TM-3-2020)
- **Paper:** [Taskmaster-1: Toward a Realistic and Diverse Dialog Dataset](https://arxiv.org/abs/1909.05358)
- **Leaderboard:** N/A
- **Point of Contact:** [Taskmaster Googlegroup](taskmaster-datasets@googlegroups.com)
### Dataset Summary
Taskmaster is a dataset for goal-oriented conversations. The Taskmaster-3 dataset consists of 23,757 movie ticketing dialogs.
By "movie ticketing" we mean conversations where the customer's goal is to purchase tickets after deciding
on theater, time, movie name, number of tickets, and date, or to opt out of the transaction. This collection
was created using the "self-dialog" method, meaning a single crowdsourced worker was
paid to create a conversation, writing the turns for both speakers, i.e. the customer and the ticketing agent.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The dataset is in English language.
## Dataset Structure
### Data Instances
A typical example looks like this
```
{
"conversation_id": "dlg-ddee80da-9ffa-4773-9ce7-f73f727cb79c",
"instructions": "SCENARIO: Pretend you’re *using a digital assistant to purchase tickets for a movie currently showing in theaters*. ...",
"scenario": "4 exchanges with 1 error and predefined variables",
"utterances": [
{
"apis": [],
"index": 0,
"segments": [
{
"annotations": [
{
"name": "num.tickets"
}
],
"end_index": 21,
"start_index": 20,
"text": "2"
},
{
"annotations": [
{
"name": "name.movie"
}
],
"end_index": 42,
"start_index": 37,
"text": "Mulan"
}
],
"speaker": "user",
"text": "I would like to buy 2 tickets to see Mulan."
},
{
"index": 6,
"segments": [],
"speaker": "user",
"text": "Yes.",
"apis": [
{
"args": [
{
"arg_name": "name.movie",
"arg_value": "Mulan"
},
{
"arg_name": "name.theater",
"arg_value": "Mountain AMC 16"
}
],
"index": 6,
"name": "book_tickets",
"response": [
{
"response_name": "status",
"response_value": "success"
}
]
}
]
}
],
"vertical": "Movie Tickets"
}
```
### Data Fields
Each conversation in the data file has the following structure:
- `conversation_id`: A universally unique identifier with the prefix 'dlg-'. The ID has no meaning.
- `utterances`: A list of utterances that make up the conversation.
- `instructions`: Instructions for the crowdsourced worker used in creating the conversation.
- `vertical`: In this dataset the vertical for all dialogs is "Movie Tickets".
- `scenario`: This is the title of the instructions for each dialog.
Each utterance has the following fields:
- `index`: A 0-based index indicating the order of the utterances in the conversation.
- `speaker`: Either USER or ASSISTANT, indicating which role generated this utterance.
- `text`: The raw text of the utterance. For self-dialogs (one_person_dialogs), this is written by the crowdsourced worker. For the WOz dialogs, 'ASSISTANT' turns are written and 'USER' turns are transcribed from the spoken recordings of crowdsourced workers.
- `segments`: A list of various text spans with semantic annotations.
- `apis`: An array of API invocations made during the utterance.
Each API has the following structure:
- `name`: The name of the API invoked (e.g. find_movies).
- `index`: The index of the parent utterance.
- `args`: A `list` of `dict` with keys `arg_name` and `arg_value` which represent the name of the argument and the value for the argument respectively.
- `response`: A `list` of `dict`s with keys `response_name` and `response_value` which represent the name of the response and the value for the response respectively.
Each segment has the following fields:
- `start_index`: The position of the start of the annotation in the utterance text.
- `end_index`: The position of the end of the annotation in the utterance text.
- `text`: The raw text that has been annotated.
- `annotations`: A list of annotation details for this segment.
Each annotation has a single field:
- `name`: The annotation name.
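The segment indices slice directly into the utterance text, so annotated spans can be recovered without any extra tokenization. A small sketch using the sample instance above (treating `end_index` as exclusive, which is an assumption that holds for the sample):

```python
def extract_annotations(utterance):
    """Return (annotation_name, span_text) pairs for one utterance,
    slicing the utterance text by each segment's indices."""
    text = utterance["text"]
    pairs = []
    for seg in utterance["segments"]:
        span = text[seg["start_index"]:seg["end_index"]]
        assert span == seg["text"]  # indices are exclusive-end in the sample
        for ann in seg["annotations"]:
            pairs.append((ann["name"], span))
    return pairs

# Trimmed-down utterance from the sample instance above.
utt = {
    "text": "I would like to buy 2 tickets to see Mulan.",
    "segments": [
        {"start_index": 20, "end_index": 21, "text": "2",
         "annotations": [{"name": "num.tickets"}]},
        {"start_index": 37, "end_index": 42, "text": "Mulan",
         "annotations": [{"name": "name.movie"}]},
    ],
}
print(extract_annotations(utt))  # [('num.tickets', '2'), ('name.movie', 'Mulan')]
```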
### Data Splits
There are no default splits; the dataset has a single `train` split. The table below lists the number of examples.
| | Train |
|-------------------|--------|
| n_instances | 23757 |
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
The dataset is licensed under `Creative Commons Attribution 4.0 License`
### Citation Information
[More Information Needed]
```
@inproceedings{48484,
title = {Taskmaster-1: Toward a Realistic and Diverse Dialog Dataset},
author = {Bill Byrne and Karthik Krishnamoorthi and Chinnadhurai Sankar and Arvind Neelakantan and Daniel Duckworth and Semih Yavuz and Ben Goodrich and Amit Dubey and Kyu-Young Kim and Andy Cedilnik},
year = {2019}
}
```
### Contributions
Thanks to [@patil-suraj](https://github.com/patil-suraj) for adding this dataset. |
adas100/resume_dataset1 | ---
dataset_info:
features:
- name: resume_str
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 12496606
num_examples: 1987
- name: test
num_bytes: 1631312
num_examples: 248
- name: validation
num_bytes: 1604207
num_examples: 249
download_size: 7940604
dataset_size: 15732125
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
dphaps/asss | ---
license: creativeml-openrail-m
---
|
nielsr/datacomp-small-10-rows-with-image-feature | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: uid
dtype: string
- name: url
dtype: string
- name: text
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 350718.0
num_examples: 10
download_size: 362017
dataset_size: 350718.0
---
# Dataset Card for "datacomp-small-10-rows-with-image-feature"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nietras/1brc.data | ---
license: apache-2.0
language:
- en
pretty_name: 1️⃣🐝🏎️ The One Billion Row Challenge - Data
size_categories:
- 1B<n<10B
viewer: false
---
# 1brc.data
1️⃣🐝🏎️ The One Billion Row Challenge - data files only, for when you can't be bothered with Java
or writing a generator and would rather just download +3GB 😁
See https://github.com/gunnarmorling/1brc for original Java challenge.
The large data files were generated on my machine and, since the official Java generator is random
(not seeded and hence not deterministic), they will differ from the official files.
Files with 10+ million rows have been compressed with [7-zip](https://www.7-zip.org/) and split into 1 GB volumes where needed.
Those files have been uploaded using git lfs.
Expected outputs (`.out`) have been generated using Java implementations. Nothing has been done to verify that the results are correct.
Other files are from https://github.com/gunnarmorling/1brc/tree/main/src/test/resources/samples with expected outputs.
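For reference, the challenge asks for min/mean/max temperature per weather station over `station;temperature` lines. A minimal (and deliberately slow) Python sketch of that aggregation, handy for checking small samples — note it does not reproduce the exact rounding/formatting of the official `.out` files:

```python
from collections import defaultdict

def aggregate(lines):
    """Min/mean/max per station from 'station;temperature' lines."""
    # Per station: [min, max, running sum, count]
    stats = defaultdict(lambda: [float("inf"), float("-inf"), 0.0, 0])
    for line in lines:
        station, temp = line.rsplit(";", 1)
        t = float(temp)
        s = stats[station]
        s[0] = min(s[0], t)
        s[1] = max(s[1], t)
        s[2] += t
        s[3] += 1
    # Stations sorted alphabetically, mean rounded to one decimal.
    return {name: (lo, round(total / n, 1), hi)
            for name, (lo, hi, total, n) in sorted(stats.items())}

print(aggregate(["Hamburg;12.0", "Bulawayo;8.9", "Hamburg;9.0"]))
# {'Bulawayo': (8.9, 8.9, 8.9), 'Hamburg': (9.0, 10.5, 12.0)}
```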
Download or clone with:
```shell
git clone https://huggingface.co/datasets/nietras/1brc.data
``` |
Doub7e/SDv2-Count-Repeated-8 | ---
dataset_info:
features:
- name: image
dtype: image
- name: prompt
dtype: string
- name: T5_last_hidden_states
sequence:
sequence:
sequence: float32
- name: style
dtype: string
splits:
- name: train
num_bytes: 1506007458.25
num_examples: 1150
download_size: 1314250724
dataset_size: 1506007458.25
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
seanpedrickcase/test_address_match_data | ---
license: apache-2.0
---
|
Manan28/optimized_financial-phrasebank_test | ---
dataset_info:
features:
- name: label
dtype: string
- name: sentence1
dtype: string
splits:
- name: test
num_bytes: 119955
num_examples: 485
download_size: 48572
dataset_size: 119955
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
reaganjlee/boolq_fr | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: question
dtype: string
- name: passage
dtype: string
- name: answer
dtype:
class_label:
names:
'0': 'False'
'1': 'True'
splits:
- name: train
num_bytes: 4295148
num_examples: 9427
- name: validation
num_bytes: 1485986
num_examples: 3270
download_size: 3536844
dataset_size: 5781134
---
# Dataset Card for "boolq_fr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nbalepur/mcqa_artifacts | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: dataset
dtype: string
- name: question
dtype: string
- name: choices
sequence: string
- name: answer_letter
dtype: string
splits:
- name: train
num_bytes: 140227
num_examples: 325
- name: test
num_bytes: 15839703
num_examples: 26506
download_size: 8684910
dataset_size: 15979930
---
# Dataset Card for "mcqa_artifacts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hopee4/peter | ---
license: openrail
---
|
liuyanchen1015/MULTI_VALUE_mrpc_completive_finish | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 46199
num_examples: 157
- name: train
num_bytes: 108748
num_examples: 385
- name: validation
num_bytes: 12903
num_examples: 46
download_size: 119292
dataset_size: 167850
---
# Dataset Card for "MULTI_VALUE_mrpc_completive_finish"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lmg-anon/VNTL-v2-1k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 24744504
num_examples: 10260
- name: val
num_bytes: 3716994
num_examples: 1566
download_size: 12528579
dataset_size: 28461498
---
# Dataset Card for "VNTL-v2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Neel-Gupta/minipile-processed_32 | ---
dataset_info:
features:
- name: text
sequence:
sequence:
sequence: int64
splits:
- name: train
num_bytes: 77148736
num_examples: 196
- name: test
num_bytes: 25191424
num_examples: 64
download_size: 9839350
dataset_size: 102340160
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
AppleHarem/maury_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of maury (Azur Lane)
This is the dataset of maury (Azur Lane), containing 18 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
A WebUI containing the crawlers and other tools is available here: [LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI)
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 18 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 45 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 45 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 18 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 18 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 18 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 45 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 45 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 21 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 45 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 45 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
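The archives in the table above can also be fetched directly over HTTP. A minimal sketch of building the download URL for one of them (the repo id and file name come from this card; `resolve/main` is the standard Hugging Face Hub path pattern):

```python
repo_id = "AppleHarem/maury_azurlane"
filename = "dataset-raw.zip"  # any file name from the table above
url = f"https://huggingface.co/datasets/{repo_id}/resolve/main/{filename}"
print(url)
```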
|
foduucom/table-detection-yolo | ---
task_categories:
- object-detection
tags:
- foduuai
- table
- Documents
- bordered table
- borderless table
- unstructured document
language:
- en
pretty_name: TableBorderNet
size_categories:
- 1K<n<10K
---
<div align="center">
<img width="640" alt="foduucom/table-detection-yolo" src="https://huggingface.co/datasets/foduucom/table-detection-yolo/resolve/main/thumbnail.jpg">
</div>
### Dataset Labels
```
['bordered', 'borderless']
```
### Number of Images
```json
{'test': 34, 'train': 238, 'valid': 70}
```
### How to Use
- Install [datasets](https://pypi.org/project/datasets/):
```bash
pip install datasets
```
- Load the dataset:
```python
from datasets import load_dataset
ds = load_dataset("foduucom/table-detection-yolo", name="full")
example = ds['train'][0]
```
### Dataset Summary
The **Table Detection Dataset** is a curated collection of images, each depicting tables that are classified as either 'bordered' or 'borderless'. The dataset is provided in YOLO format, featuring annotations for accurate object detection and classification. It serves as a valuable resource for researchers, developers, and practitioners working on table detection tasks, with a specific focus on distinguishing between tables with distinct visual characteristics.
**Key Features:**
- **Image Variety:** The dataset encompasses a diverse range of images, capturing tables from various real-world scenarios and environments.
- **Annotation Precision:** Each image is meticulously annotated with bounding box coordinates and class labels, indicating whether the table is 'bordered' or 'borderless'.
- **YOLO Format:** Annotations follow the YOLO format, making it suitable for training and evaluating object detection models.
- **Research and Development:** The dataset is designed to facilitate advancements in table detection algorithms and technologies, enabling the development of models capable of accurately identifying and classifying different types of tables.
Whether you are working on document analysis, data extraction, or image-based content recognition, the Table Detection Dataset provides an essential foundation for enhancing the capabilities of object detection models in identifying tables with varying visual attributes. By offering a comprehensive collection of border and borderless tables, this dataset empowers the AI community to tackle challenges in table detection across a wide range of applications.
For more details and access to the dataset, please contact info@foduu.com. |
zolak/twitter_dataset_78_1713126377 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 314394
num_examples: 773
download_size: 160407
dataset_size: 314394
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pphildan/inferno_train3 | ---
dataset_info:
features:
- name: data
sequence:
sequence:
sequence: float32
- name: target
sequence:
sequence: int64
splits:
- name: train
num_bytes: 1267120
num_examples: 10
download_size: 1000243
dataset_size: 1267120
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "inferno_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arthurmluz/wikilingua_data-wiki_gptextsum_results | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: summary
dtype: string
- name: gen_summary
dtype: string
- name: rouge
struct:
- name: rouge1
dtype: float64
- name: rouge2
dtype: float64
- name: rougeL
dtype: float64
- name: rougeLsum
dtype: float64
- name: bert
struct:
- name: f1
sequence: float64
- name: hashcode
dtype: string
- name: precision
sequence: float64
- name: recall
sequence: float64
splits:
- name: validation
num_bytes: 22119069
num_examples: 8165
download_size: 13030706
dataset_size: 22119069
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "wikilingua_data-wiki_gptextsum_results"
rouge= {'rouge1': 0.33197745689950786, 'rouge2': 0.12351364096382854, 'rougeL': 0.24922924859888235, 'rougeLsum': 0.24922924859888235}
bert= {'precision': 0.761901833262096, 'recall': 0.7622503304255206, 'f1': 0.7612595360336023} |
chitradrishti/AADB | ---
license: unknown
---
|