| datasetId | card |
|---|---|
Nan-Do/instructional_code-search-net-javacript | ---
dataset_info:
features:
- name: INSTRUCTION
dtype: string
- name: RESPONSE
dtype: string
- name: SOURCE
dtype: string
splits:
- name: train
num_bytes: 126970947
num_examples: 121323
download_size: 49942966
dataset_size: 126970947
license: apache-2.0
task_categories:
- conversational
- text-generation
- text2text-generation
language:
- en
tags:
- JavaScript
- Code Generation
- Instruction Response
pretty_name: Instructional JavaScript Dataset
---
# Dataset Card for "instructional_code-search-net-javacript"
## Dataset Description
- **Homepage:** None
- **Repository:** https://huggingface.co/datasets/Nan-Do/instructional_code-search-net-javascript
- **Paper:** None
- **Leaderboard:** None
- **Point of Contact:** [@Nan-Do](https://github.com/Nan-Do)
### Dataset Summary
This is an instructional dataset for JavaScript.
The dataset contains two different kinds of tasks:
- Given a piece of code, generate a description of what it does.
- Given a description, generate a piece of code that fulfils it.
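The dataset can be loaded with the `datasets` library; a minimal sketch, assuming the repository ID and column names shown in the metadata above:
```python
from datasets import load_dataset

# Load the single train split; each row has INSTRUCTION, RESPONSE and SOURCE columns.
ds = load_dataset("Nan-Do/instructional_code-search-net-javacript", split="train")

example = ds[0]
print(example["INSTRUCTION"])  # e.g. a request to explain or to write JavaScript code
print(example["RESPONSE"])     # the matching description or code snippet
```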
### Languages
The dataset is in English.
### Data Splits
The dataset has a single train split; no validation or test splits are provided.
## Dataset Creation
May 2023
### Curation Rationale
This dataset was created to improve the coding capabilities of LLMs.
### Source Data
The summarized version of the code-search-net dataset can be found at https://huggingface.co/datasets/Nan-Do/code-search-net-javascript
### Annotations
The dataset includes INSTRUCTION and RESPONSE columns.
#### Annotation process
The annotation procedure was done using templates and NLP techniques to generate human-like instructions and responses.
A sample notebook of the process can be found at https://github.com/Nan-Do/OpenAssistantInstructionResponsePython
The annotations have been cleaned to make sure there are no repetitions or meaningless summaries.
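As a rough illustration of the kind of template-based generation described above (the templates and the `make_pair` helper below are hypothetical sketches, not taken from the linked notebook):
```python
import random

# Hypothetical templates for the two task directions described in this card.
CODE_TO_TEXT = [
    "Can you explain what the following JavaScript function does?\n{code}",
    "Describe the purpose of this JavaScript code:\n{code}",
]
TEXT_TO_CODE = [
    "Write a JavaScript function that {summary}",
    "Implement the following in JavaScript: {summary}",
]

def make_pair(code: str, summary: str) -> dict:
    """Turn one (code, summary) pair into an INSTRUCTION/RESPONSE row."""
    if random.random() < 0.5:
        template = random.choice(CODE_TO_TEXT)
        return {"INSTRUCTION": template.format(code=code), "RESPONSE": summary}
    template = random.choice(TEXT_TO_CODE)
    return {"INSTRUCTION": template.format(summary=summary), "RESPONSE": code}
```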
### Licensing Information
Apache 2.0 |
open-llm-leaderboard/details_uukuguy__speechless-codellama-platypus-13b | ---
pretty_name: Evaluation run of uukuguy/speechless-codellama-platypus-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/speechless-codellama-platypus-13b](https://huggingface.co/uukuguy/speechless-codellama-platypus-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-codellama-platypus-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T04:45:04.706301](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-codellama-platypus-13b/blob/main/results_2023-10-25T04-45-04.706301.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.008494127516778523,\n\
\ \"em_stderr\": 0.0009398243325411525,\n \"f1\": 0.05910234899328872,\n\
\ \"f1_stderr\": 0.001500499797469734,\n \"acc\": 0.3734290226086707,\n\
\ \"acc_stderr\": 0.010636722374509789\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.008494127516778523,\n \"em_stderr\": 0.0009398243325411525,\n\
\ \"f1\": 0.05910234899328872,\n \"f1_stderr\": 0.001500499797469734\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09097801364670205,\n \
\ \"acc_stderr\": 0.007921322844013643\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6558800315706393,\n \"acc_stderr\": 0.013352121905005935\n\
\ }\n}\n```"
repo_url: https://huggingface.co/uukuguy/speechless-codellama-platypus-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|arc:challenge|25_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|arc:challenge|25_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T18_50_42.836793
path:
- '**/details_harness|drop|3_2023-10-18T18-50-42.836793.parquet'
- split: 2023_10_25T04_45_04.706301
path:
- '**/details_harness|drop|3_2023-10-25T04-45-04.706301.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T04-45-04.706301.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T18_50_42.836793
path:
- '**/details_harness|gsm8k|5_2023-10-18T18-50-42.836793.parquet'
- split: 2023_10_25T04_45_04.706301
path:
- '**/details_harness|gsm8k|5_2023-10-25T04-45-04.706301.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T04-45-04.706301.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hellaswag|10_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hellaswag|10_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T18_50_42.836793
path:
- '**/details_harness|winogrande|5_2023-10-18T18-50-42.836793.parquet'
- split: 2023_10_25T04_45_04.706301
path:
- '**/details_harness|winogrande|5_2023-10-25T04-45-04.706301.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T04-45-04.706301.parquet'
- config_name: results
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- results_2023-08-31T15:51:18.379129.parquet
- split: 2023_09_12T15_51_14.957387
path:
- results_2023-09-12T15-51-14.957387.parquet
- split: 2023_10_18T18_50_42.836793
path:
- results_2023-10-18T18-50-42.836793.parquet
- split: 2023_10_25T04_45_04.706301
path:
- results_2023-10-25T04-45-04.706301.parquet
- split: latest
path:
- results_2023-10-25T04-45-04.706301.parquet
---
# Dataset Card for Evaluation run of uukuguy/speechless-codellama-platypus-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/speechless-codellama-platypus-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/speechless-codellama-platypus-13b](https://huggingface.co/uukuguy/speechless-codellama-platypus-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-codellama-platypus-13b",
"harness_winogrande_5",
split="train")
```
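Besides the `latest`/`train` aliases, each configuration exposes one split per run timestamp (listed in the YAML header above), and the aggregated scores live in the `results` configuration; a minimal sketch:
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_uukuguy__speechless-codellama-platypus-13b"

# Details of one specific run; split names use underscores in place of '-' and ':'.
run = load_dataset(repo, "harness_winogrande_5", split="2023_10_25T04_45_04.706301")

# Aggregated metrics from the latest run.
results = load_dataset(repo, "results", split="latest")
```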
## Latest results
These are the [latest results from run 2023-10-25T04:45:04.706301](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-codellama-platypus-13b/blob/main/results_2023-10-25T04-45-04.706301.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.008494127516778523,
"em_stderr": 0.0009398243325411525,
"f1": 0.05910234899328872,
"f1_stderr": 0.001500499797469734,
"acc": 0.3734290226086707,
"acc_stderr": 0.010636722374509789
},
"harness|drop|3": {
"em": 0.008494127516778523,
"em_stderr": 0.0009398243325411525,
"f1": 0.05910234899328872,
"f1_stderr": 0.001500499797469734
},
"harness|gsm8k|5": {
"acc": 0.09097801364670205,
"acc_stderr": 0.007921322844013643
},
"harness|winogrande|5": {
"acc": 0.6558800315706393,
"acc_stderr": 0.013352121905005935
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
AnalyticalRecon/lus_en | ---
license: mit
task_categories:
- translation
language:
- en
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/temari_naruto | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of temari (NARUTO)
This is the dataset of temari (NARUTO), containing 200 images and their tags.
Images are crawled from many sites (e.g., danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
|
AutoPlow/test | ---
license: mit
dataset_info:
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: audio
struct:
- name: bytes
dtype: 'null'
- name: path
dtype: string
- name: sentence
dtype: string
- name: up_votes
dtype: int64
- name: down_votes
dtype: int64
- name: age
dtype: string
- name: gender
dtype: string
- name: accent
dtype: string
- name: locale
dtype: string
- name: segment
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 4749424
num_examples: 7932
download_size: 705920
dataset_size: 4749424
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_AurelPx__Dare-k-7B-ties | ---
pretty_name: Evaluation run of AurelPx/Dare-k-7B-ties
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AurelPx/Dare-k-7B-ties](https://huggingface.co/AurelPx/Dare-k-7B-ties) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AurelPx__Dare-k-7B-ties\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-22T01:51:21.252099](https://huggingface.co/datasets/open-llm-leaderboard/details_AurelPx__Dare-k-7B-ties/blob/main/results_2024-03-22T01-51-21.252099.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6549505412275739,\n\
\ \"acc_stderr\": 0.0318457403527982,\n \"acc_norm\": 0.6545951827427666,\n\
\ \"acc_norm_stderr\": 0.03250570751010229,\n \"mc1\": 0.47123623011015914,\n\
\ \"mc1_stderr\": 0.017474513848525525,\n \"mc2\": 0.639077073664674,\n\
\ \"mc2_stderr\": 0.015151703268240214\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6680887372013652,\n \"acc_stderr\": 0.01376098820088054,\n\
\ \"acc_norm\": 0.6911262798634812,\n \"acc_norm_stderr\": 0.013501770929344003\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6890061740689106,\n\
\ \"acc_stderr\": 0.004619542392006391,\n \"acc_norm\": 0.8708424616610237,\n\
\ \"acc_norm_stderr\": 0.0033468881153524573\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7320754716981132,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.7320754716981132,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n\
\ \"acc_stderr\": 0.03514942551267439,\n \"acc_norm\": 0.6936416184971098,\n\
\ \"acc_norm_stderr\": 0.03514942551267439\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356853,\n \"\
acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356853\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"\
acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660836,\n \"\
acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660836\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n\
\ \"acc_stderr\": 0.026460569561240644,\n \"acc_norm\": 0.8284313725490197,\n\
\ \"acc_norm_stderr\": 0.026460569561240644\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.8227848101265823,\n \"acc_stderr\": 0.024856364184503217,\n\
\ \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.024856364184503217\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624734,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624734\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179326,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179326\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8390804597701149,\n\
\ \"acc_stderr\": 0.013140225515611724,\n \"acc_norm\": 0.8390804597701149,\n\
\ \"acc_norm_stderr\": 0.013140225515611724\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044287,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41787709497206704,\n\
\ \"acc_stderr\": 0.016495400635820084,\n \"acc_norm\": 0.41787709497206704,\n\
\ \"acc_norm_stderr\": 0.016495400635820084\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818737,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818737\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042103,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042103\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46088657105606257,\n\
\ \"acc_stderr\": 0.012731102790504515,\n \"acc_norm\": 0.46088657105606257,\n\
\ \"acc_norm_stderr\": 0.012731102790504515\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.01890101532209309,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.01890101532209309\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827072,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827072\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.47123623011015914,\n\
\ \"mc1_stderr\": 0.017474513848525525,\n \"mc2\": 0.639077073664674,\n\
\ \"mc2_stderr\": 0.015151703268240214\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8200473559589582,\n \"acc_stderr\": 0.01079646868806868\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7156937073540561,\n \
\ \"acc_stderr\": 0.01242507818839598\n }\n}\n```"
repo_url: https://huggingface.co/AurelPx/Dare-k-7B-ties
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|arc:challenge|25_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|gsm8k|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hellaswag|10_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T01-51-21.252099.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T01-51-21.252099.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- '**/details_harness|winogrande|5_2024-03-22T01-51-21.252099.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-22T01-51-21.252099.parquet'
- config_name: results
data_files:
- split: 2024_03_22T01_51_21.252099
path:
- results_2024-03-22T01-51-21.252099.parquet
- split: latest
path:
- results_2024-03-22T01-51-21.252099.parquet
---
# Dataset Card for Evaluation run of AurelPx/Dare-k-7B-ties
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AurelPx/Dare-k-7B-ties](https://huggingface.co/AurelPx/Dare-k-7B-ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
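The aggregated numbers can be pulled on their own by loading that configuration (a minimal sketch; the config name "results" and the "latest" split are taken from the configs listed in this card):
```python
from datasets import load_dataset

# Load only the aggregated results of the evaluation run; the "latest"
# split always points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_AurelPx__Dare-k-7B-ties",
    "results",
    split="latest",
)
```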
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AurelPx__Dare-k-7B-ties",
"harness_winogrande_5",
split="train")
```
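To pin a specific run rather than the latest one, the timestamped split can be requested directly (a sketch; the split name below is the single run documented on this card):
```python
from datasets import load_dataset

# Each run is stored as a split named after its timestamp.
data = load_dataset(
    "open-llm-leaderboard/details_AurelPx__Dare-k-7B-ties",
    "harness_winogrande_5",
    split="2024_03_22T01_51_21.252099",
)
```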
## Latest results
These are the [latest results from run 2024-03-22T01:51:21.252099](https://huggingface.co/datasets/open-llm-leaderboard/details_AurelPx__Dare-k-7B-ties/blob/main/results_2024-03-22T01-51-21.252099.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6549505412275739,
"acc_stderr": 0.0318457403527982,
"acc_norm": 0.6545951827427666,
"acc_norm_stderr": 0.03250570751010229,
"mc1": 0.47123623011015914,
"mc1_stderr": 0.017474513848525525,
"mc2": 0.639077073664674,
"mc2_stderr": 0.015151703268240214
},
"harness|arc:challenge|25": {
"acc": 0.6680887372013652,
"acc_stderr": 0.01376098820088054,
"acc_norm": 0.6911262798634812,
"acc_norm_stderr": 0.013501770929344003
},
"harness|hellaswag|10": {
"acc": 0.6890061740689106,
"acc_stderr": 0.004619542392006391,
"acc_norm": 0.8708424616610237,
"acc_norm_stderr": 0.0033468881153524573
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7320754716981132,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.7320754716981132,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.03514942551267439,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.03514942551267439
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660836,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660836
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.024856364184503217,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.024856364184503217
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624734,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624734
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179326,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179326
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8390804597701149,
"acc_stderr": 0.013140225515611724,
"acc_norm": 0.8390804597701149,
"acc_norm_stderr": 0.013140225515611724
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044287,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41787709497206704,
"acc_stderr": 0.016495400635820084,
"acc_norm": 0.41787709497206704,
"acc_norm_stderr": 0.016495400635820084
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818737,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818737
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042103,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042103
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46088657105606257,
"acc_stderr": 0.012731102790504515,
"acc_norm": 0.46088657105606257,
"acc_norm_stderr": 0.012731102790504515
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.01890101532209309,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.01890101532209309
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827072,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827072
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.47123623011015914,
"mc1_stderr": 0.017474513848525525,
"mc2": 0.639077073664674,
"mc2_stderr": 0.015151703268240214
},
"harness|winogrande|5": {
"acc": 0.8200473559589582,
"acc_stderr": 0.01079646868806868
},
"harness|gsm8k|5": {
"acc": 0.7156937073540561,
"acc_stderr": 0.01242507818839598
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
sradc/chunked-wikipedia20220301en-bookcorpusopen | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 26076989556
num_examples: 33536113
download_size: 15221565467
dataset_size: 26076989556
---
# Dataset Card for "chunked-wikipedia20220301en-bookcorpusopen"
```
num_examples: 33.5 million
download_size: 15.3 GB
dataset_size: 26.1 GB
```
This dataset combines [wikipedia20220301.en](https://huggingface.co/datasets/wikipedia) and [bookcorpusopen](https://huggingface.co/datasets/bookcorpusopen),
and splits the data into smaller chunks of ~820 characters
(such that each item will be at least ~128 tokens for the average tokenizer).
The logic only splits on spaces, so the chunks are likely to be slightly larger than 820 characters.
The dataset has been normalized to lower case, with accents and non-English characters removed.
Items with fewer than 200 characters or more than 1000 characters have been removed.
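For reference, here is a minimal sketch of this chunking logic (the exact processing script is not published here, so the helper names and details below are illustrative):
```python
import unicodedata

def normalize(text: str) -> str:
    # Lower-case and drop accents / non-English characters (illustrative).
    ascii_text = unicodedata.normalize("NFKD", text).encode("ascii", "ignore").decode("ascii")
    return ascii_text.lower()

def chunk(text: str, target_chars: int = 820):
    # Split only on spaces and greedily pack whole words, so chunks
    # end up slightly larger than the ~820-char target.
    current, size = [], 0
    for word in text.split(" "):
        current.append(word)
        size += len(word) + 1
        if size >= target_chars:
            yield " ".join(current)
            current, size = [], 0
    if current:
        yield " ".join(current)

document = "some long wikipedia or book text ..."  # placeholder input
chunks = [c for c in chunk(normalize(document)) if 200 <= len(c) <= 1000]
```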
The data has not been shuffled (you can either use `dataset.shuffle(...)`,
or download the shuffled version [here](https://huggingface.co/datasets/sradc/chunked-shuffled-wikipedia20220301en-bookcorpusopen),
which will be faster to iterate over).
This dataset is processed for convenience, at the expense of losing some percentage of the tokens due to truncation
(assuming the training minibatches are truncated to 128 tokens). |
Tochi2023/kolizo-designs-dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: ' text'
dtype: string
splits:
- name: train
num_bytes: 200160.0
num_examples: 10
download_size: 197786
dataset_size: 200160.0
---
# Dataset Card for "kolizo-designs-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/med_alpaca_standardized_cluster_57_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 6945707
num_examples: 13323
download_size: 3386882
dataset_size: 6945707
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_57_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-high_school_geography-dev | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: dev
num_bytes: 2734
num_examples: 5
download_size: 0
dataset_size: 2734
---
# Dataset Card for "mmlu-high_school_geography-dev"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlekseyKorshuk/CS1QACensoredClassEval-WizardCoder-3B-V1.0-dpo-beta-0.01annotations | ---
dataset_info:
features:
- name: model_input
list:
- name: content
dtype: string
- name: role
dtype: string
- name: baseline_response
dtype: string
- name: response
dtype: string
- name: annotation
dtype: float64
splits:
- name: train
num_bytes: 324216
num_examples: 100
download_size: 116664
dataset_size: 324216
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
chiHang/online-sweater | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 1126581.0
num_examples: 10
download_size: 0
dataset_size: 1126581.0
---
# Dataset Card for "online-sweater"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
revanks/Sentence_review_100 | ---
license: apache-2.0
---
|
parksimon0808/test | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 64
num_examples: 2
download_size: 1726
dataset_size: 64
---
# Dataset Card for "test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nibbz2024/SurveySense-datasetV2 | ---
language:
- en
dataset_info:
features:
- name: Concept
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 41104
num_examples: 40
download_size: 22231
dataset_size: 41104
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
irds/lotte_writing_test | ---
pretty_name: '`lotte/writing/test`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `lotte/writing/test`
The `lotte/writing/test` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/lotte#lotte/writing/test).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=199,994
This dataset is used by: [`lotte_writing_test_forum`](https://huggingface.co/datasets/irds/lotte_writing_test_forum), [`lotte_writing_test_search`](https://huggingface.co/datasets/irds/lotte_writing_test_search)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/lotte_writing_test', 'docs')
for record in docs:
record # {'doc_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Santhanam2021ColBERTv2,
title = "ColBERTv2: Effective and Efficient Retrieval via Lightweight Late Interaction",
author = "Keshav Santhanam and Omar Khattab and Jon Saad-Falcon and Christopher Potts and Matei Zaharia",
journal= "arXiv preprint arXiv:2112.01488",
year = "2021",
url = "https://arxiv.org/abs/2112.01488"
}
```
|
GEM-submissions/lewtun__this-is-a-test-name__1655900658 | ---
benchmark: gem
type: prediction
submission_name: This is a test name
tags:
- evaluation
- benchmark
---
# GEM Submission
Submission name: This is a test name
|
rmihiranga/sinhala-text-fullfill-v3 | ---
dataset_info:
features:
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 2409515
num_examples: 471
download_size: 675738
dataset_size: 2409515
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "sinhala-text-fullfill-v3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yahoo_answers_topics | ---
annotations_creators:
- found
language_creators:
- found
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 1M<n<10M
source_datasets:
- extended|other-yahoo-answers-corpus
task_categories:
- text-classification
task_ids:
- topic-classification
pretty_name: YahooAnswersTopics
dataset_info:
features:
- name: id
dtype: int32
- name: topic
dtype:
class_label:
names:
'0': Society & Culture
'1': Science & Mathematics
'2': Health
'3': Education & Reference
'4': Computers & Internet
'5': Sports
'6': Business & Finance
'7': Entertainment & Music
'8': Family & Relationships
'9': Politics & Government
- name: question_title
dtype: string
- name: question_content
dtype: string
- name: best_answer
dtype: string
config_name: yahoo_answers_topics
splits:
- name: train
num_bytes: 760460695
num_examples: 1400000
- name: test
num_bytes: 32661362
num_examples: 60000
download_size: 319476345
dataset_size: 793122057
train-eval-index:
- config: yahoo_answers_topics
task: text-classification
task_id: multi_class_classification
splits:
train_split: train
eval_split: test
col_mapping:
question_content: text
topic: target
metrics:
- type: accuracy
name: Accuracy
- type: f1
name: F1 macro
args:
average: macro
- type: f1
name: F1 micro
args:
average: micro
- type: f1
name: F1 weighted
args:
average: weighted
- type: precision
name: Precision macro
args:
average: macro
- type: precision
name: Precision micro
args:
average: micro
- type: precision
name: Precision weighted
args:
average: weighted
- type: recall
name: Recall macro
args:
average: macro
- type: recall
name: Recall micro
args:
average: micro
- type: recall
name: Recall weighted
args:
average: weighted
---
# Dataset Card for "Yahoo Answers Topics"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [More Information Needed]
- **Repository:** https://github.com/LC-John/Yahoo-Answers-Topic-Classification-Dataset
- **Paper:** [More Information Needed]
- **Leaderboard:** [More Information Needed]
- **Point of Contact:** [More Information Needed]
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@patil-suraj](https://github.com/patil-suraj) for adding this dataset. |
iapp_wiki_qa_squad | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- th
license:
- mit
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- extended|other-iapp-wiki-qa-dataset
task_categories:
- question-answering
task_ids:
- extractive-qa
- open-domain-qa
paperswithcode_id: null
pretty_name: IappWikiQaSquad
dataset_info:
features:
- name: question_id
dtype: string
- name: article_id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer_end
dtype: int32
config_name: iapp_wiki_qa_squad
splits:
- name: train
num_bytes: 16107541
num_examples: 5761
- name: validation
num_bytes: 2120768
num_examples: 742
- name: test
num_bytes: 2032016
num_examples: 739
download_size: 2876630
dataset_size: 20260325
---
# Dataset Card for `iapp_wiki_qa_squad`
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/iapp-technology/iapp-wiki-qa-dataset
- **Repository:** https://github.com/iapp-technology/iapp-wiki-qa-dataset
- **Paper:**
- **Leaderboard:**
- **Point of Contact:** https://github.com/iapp-technology/iapp-wiki-qa-dataset
### Dataset Summary
`iapp_wiki_qa_squad` is an extractive question answering dataset from Thai Wikipedia articles. It is adapted from [the original iapp-wiki-qa-dataset](https://github.com/iapp-technology/iapp-wiki-qa-dataset) to [SQuAD](https://rajpurkar.github.io/SQuAD-explorer/) format, resulting in 5761/742/739 questions from 1529/191/192 articles.
### Supported Tasks and Leaderboards
extractive question answering
### Languages
Thai
## Dataset Structure
### Data Instances
An example from the dataset:
```
{'article_id': '0U2lA8nJQESIxbZrjZQc',
'question_id': '0U2lA8nJQESIxbZrjZQc_000',
'context': 'นายสุวัฒน์ วรรณศิริกุล (1 พฤศจิกายน พ.ศ. 2476 - 31 กรกฎาคม พ.ศ. 2555) อดีตรองหัวหน้าพรรคพลังประชาชน อดีตประธานสมาชิกสภาผู้แทนราษฎร และประธานภาคกรุงเทพมหานคร พรรคพลังประชาชน อดีตสมาชิกสภาผู้แทนราษฎรกรุงเทพมหานครหลายสมัย ได้รับการเลือกตั้งเป็นสมาชิกสภาผู้แทนราษฎรครั้งแรกในปี พ.ศ. 2529 ในสังกัดพรรคประชากรไทย และสังกัดพรรคพลังประชาชน เป็นพรรคสุดท้าย',
'question': 'สุวัฒน์ วรรณศิริกุล เกิดวันที่เท่าไร',
'answers': {'text': ['1 พฤศจิกายน พ.ศ. 2476'],
'answer_start': [24],
'answer_end': [45]},
'title': 'สุวัฒน์ วรรณศิริกุล',
'created_by': 'gmnjGRF0y0g7QRZDd9Qgz3AgiHJ3',
'created_on': '2019-08-18 05:05:51.358000+00:00',
'is_pay': {'date': None, 'status': False}}
{'article_id': '01KZTrxgvC5mOovXFMPJ',
'question_id': '01KZTrxgvC5mOovXFMPJ_000',
'context': 'พัทธ์ธีรา ศรุติพงศ์โภคิน (เกิด 3 ธันวาคม พ.ศ. 2533) หรือชื่อเล่นว่า อร เป็นนักแสดงหญิงชาวไทย สำเร็จมัธยมศึกษาจากCatholic Cathedral College ประเทศนิวซีแลนด์ และปริญญาตรีจากRaffles International College สาขา Business Marketing\n\nเข้าสู่วงการตั้งแต่อายุ 6 ขวบ จากการแสดงละครเวทีกับ ครูชลประคัลภ์ จันทร์เรือง จากนั้นก็เล่นโฆษณาในวัยเด็ก 2- 3 ชิ้น และยังเคยแสดงช่วงละครสั้น ในรายการซุปเปอร์จิ๋ว ประมาณปี 2542\n\nปัจจุบันเป็นทั้ง นักแสดง , พิธีกร และ วีเจ อยู่ที่คลื่น เก็ท 102.5 Bangkok International Hits Music Station และยังเป็นพิธีกรให้กับช่อง ทรู มิวสิก',
'question': 'พัทธ์ธีรา ศรุติพงศ์โภคิน เกิดวันที่เท่าไร',
'answers': {'text': ['3 ธันวาคม พ.ศ. 2533'],
'answer_start': [31],
'answer_end': [50]},
'title': 'พัทธ์ธีรา ศรุติพงศ์โภคิน',
'created_by': 'gmnjGRF0y0g7QRZDd9Qgz3AgiHJ3',
'created_on': '2019-08-07 14:00:38.778000+00:00',
'is_pay': {'status': True,
'total': 2.5,
'date': '2019-08-13 10:47:28.095000+00:00'}}
```
### Data Fields
```
{
"question_id": question id
"article_id": article id
"title": article title
"context": article texts
"question": question
"answers":
{
"text": answer text
"answer_start": answer beginning position
"answer_end": answer exclusive upper bound position
}
}
```
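Since `answer_start`/`answer_end` are character offsets into `context` (as the instances above suggest), a span can be recovered by simple slicing. A quick illustrative check:
```python
from datasets import load_dataset

ds = load_dataset("iapp_wiki_qa_squad", split="train")

example = ds[0]
answers = example["answers"]
start, end = answers["answer_start"][0], answers["answer_end"][0]

# Slicing the context with the annotated offsets should reproduce the answer text.
assert example["context"][start:end] == answers["text"][0]
```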
### Data Splits
| | train | valid | test |
|-------------|-------|-------|------|
| # questions | 5761 | 742 | 739 |
| # articles | 1529 | 191 | 192 |
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
From the original `iapp-wiki-qa-dataset`, [@cstorm125](https://github.com/cstorm125/) applied the following processing:
- Select questions with one, non-empty answer
- Select questions whose answers match `textDetection` fields
- Select questions whose answers are 100 characters long or shorter
- 80/10/10 train-validation-test split at article level
#### Who are the source language producers?
Wikipedia authors for contexts and annotators hired by [iApp](https://iapp.co.th/) for questions and answer annotations
### Annotations
#### Annotation process
Annotators hired by [iApp](https://iapp.co.th/) are asked to create questions and answers for each article.
#### Who are the annotators?
Annotators hired by [iApp](https://iapp.co.th/)
### Personal and Sensitive Information
All contents are from Wikipedia. No personal and sensitive information is expected to be included.
## Considerations for Using the Data
### Social Impact of Dataset
- open-domain, extractive question answering in Thai
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
Original dataset by [iApp](https://iapp.co.th/). SQuAD formatting by [PyThaiNLP](https://github.com/PyThaiNLP/).
### Licensing Information
MIT
### Citation Information
```
@dataset{kobkrit_viriyayudhakorn_2021_4539916,
author = {Kobkrit Viriyayudhakorn and
Charin Polpanumas},
title = {iapp\_wiki\_qa\_squad},
month = feb,
year = 2021,
publisher = {Zenodo},
version = 1,
doi = {10.5281/zenodo.4539916},
url = {https://doi.org/10.5281/zenodo.4539916}
}
```
### Contributions
Thanks to [@cstorm125](https://github.com/cstorm125) for adding this dataset. |
tyzhu/v1.1_id0.2_context_instruction_tuning | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: task_source
dtype: string
- name: task_name
dtype: string
- name: template_type
dtype: string
- name: context
dtype: string
- name: template_used
dtype: string
splits:
- name: train
num_bytes: 1154915040.1878934
num_examples: 437288
- name: eval_context
num_bytes: 38006832.85245361
num_examples: 13944
- name: eval_id_context
num_bytes: 10843981
num_examples: 5976
download_size: 237906027
dataset_size: 1203765854.040347
---
# Dataset Card for "v1.1_id0.2_context_instruction_tuning"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
allenai/commongen_lite | ---
configs:
- config_name: all
data_files:
- split: train
    path: commongen_hard_nohuman.json
license: apache-2.0
task_categories:
- text-generation
language:
- en
pretty_name: CommonGen-Lite
size_categories:
- n<1K
datasets:
- commongen
---
# CommonGen-Lite
Evaluating LLMs with [CommonGen](https://inklab.usc.edu/CommonGen/) using the [CommonGen-lite](https://huggingface.co/datasets/allenai/commongen_lite) dataset (400 examples + 900 human references).
We use GPT-4 to evaluate the constrained text generation ability of LLMs. Please see more in our [paper](https://arxiv.org/abs/1911.03705).
Github: [https://github.com/allenai/CommonGen-Eval](https://github.com/allenai/CommonGen-Eval)
## Leaderboard
| model | len | cover | pos | win_tie | overall |
|----------------------------|-------|---------|-------|-----------|-----------|
| human | 12.84 | 99.00 | 98.11 | 100.00 | 97.13 |
| gpt-4-0613 | 14.13 | 97.44 | 91.78 | 50.44 | 45.11 |
| gpt-4-1106-preview | 14.90 | 96.33 | 90.11 | 50.78 | 44.08 |
| gpt-3.5-turbo | 12.76 | 92.11 | 83.00 | 49.78 | 38.06 |
| Yi-34b-chat | 13.45 | 80.11 | 75.11 | 39.44 | 23.73 |
| vicuna-13b-v1.5 | 15.02 | 85.89 | 79.56 | 27.44 | 18.75 |
| tulu-2-dpo-70b | 17.89 | 88.78 | 80.11 | 23.00 | 16.36 |
| Mixtral-8x7B-Instruct-v0.1 | 20.15 | 84.11 | 73.33 | 17.89 | 11.03 |
| Llama-2-7b-chat-hf | 16.06 | 88.56 | 76.44 | 15.44 | 10.45 |
| zephyr-7b-beta | 15.76 | 82.44 | 72.78 | 16.89 | 10.13 |
| Yi-6b-chat | 13.32 | 71.67 | 63.56 | 22.11 | 10.07 |
- **length**: the number of words on average in the generated sentences
- **cover**: the percentage of examples where all given concepts are covered by model outputs
- **PoS**: the percentage of examples where the part-of-speech (PoS) of ALL given concepts are correct in model outputs
- **win_tie**: the percentage of examples where GPT-4-turbo prefers the model outputs over the human-written references (or thinks they are equally good)
- **overall**: `cover%` x `pos%` x `win_tie%`
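As a quick sanity check (not official scoring code), the `overall` score of the `gpt-4-0613` row can be reproduced from its three component percentages:
```python
# overall = cover% x pos% x win_tie%, expressed again as a percentage
cover, pos, win_tie = 97.44, 91.78, 50.44  # gpt-4-0613 row above

overall = (cover / 100) * (pos / 100) * (win_tie / 100) * 100
print(round(overall, 2))  # -> 45.11, matching the leaderboard
```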
Link: [https://inklab.usc.edu/CommonGen/leaderboard.html](https://inklab.usc.edu/CommonGen/leaderboard.html)
To get your model on the leaderboard, please create an issue or PR to submit the inference script. I'll run the evaluation script and update the leaderboard.
You will not need to run the evaluation script yourself (the human references are hidden to avoid data leakage).
## Prompts
<details>
<summary>Instruction Prompt Template (2-shots prompting)</summary>
```markdown
# Instruction
Given several concepts (i.e., nouns or verbs), write a short and simple sentence that contains *all* the required words.
The sentence should describe a common scene in daily life, and the concepts should be used in a natural way.
# Examples
## Example 1
- Concepts: "dog(noun), frisbee(noun), catch(verb), throw(verb)"
- Sentence: The dog catches the frisbee when the boy throws it into the air.
## Example 2
- Concepts: "apple(noun), place(verb), tree(noun), pick(verb)"
- Sentence: A girl picks some apples from a tree and places them into her basket.
# Your Task
- Concepts: "{$concept_list}"
- Sentence:
```
</details>
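To build a prompt, the `{$concept_list}` placeholder in the template is replaced with a comma-separated concept string. A small illustrative sketch (the `_N`/`_V`-to-`(noun)`/`(verb)` mapping mirrors the raw concept tags shown in the case studies below and is an assumption about the preprocessing):
```python
# Tail of the instruction template above; "{$concept_list}" is its placeholder.
PROMPT_TAIL = '# Your Task\n- Concepts: "{$concept_list}"\n- Sentence:'

# Map raw dataset tags (e.g. "sew_V") to the prompt format (e.g. "sew(verb)").
SUFFIX = {"_N": "(noun)", "_V": "(verb)"}

def to_prompt_concepts(raw_concepts):
    return ", ".join(c[:-2] + SUFFIX[c[-2:]] for c in raw_concepts)

prompt = PROMPT_TAIL.replace(
    "{$concept_list}",
    to_prompt_concepts(["demonstrate_V", "machine_N", "sew_V", "use_V"]),
)
print(prompt)  # ... - Concepts: "demonstrate(verb), machine(noun), sew(verb), use(verb)"
```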
<details>
<summary>Evaluation Prompt Template (Pairwise Comparison)</summary>
```markdown
# Data
Given several concepts (i.e., nouns or verbs), we ask models to write a short and simple sentence that contains *all* the required words.
The sentence should describe a common scene in daily life, and the concepts should be used in a natural way.
Concepts: "{$concept_list}"
Model A: "{$candidate_A}"
Model B: "{$candidate_B}"
# Your Task
Your task is to choose a better sentence from the two candidates. Decide which model's sentence is better in terms of the naturalness and commonness of the scenes they describe.
## Rules:
- A better sentence should describe a common scene in daily life, and all concepts should be used in a natural way.
- You should prefer sentences that use all given concepts with correct part-of-speech tags.
- A simpler and shorter sentence is preferred if it describes the same scene as the other sentence.
- If you think both sentences are equally good or bad, please choose "tie".
Now, please output your choice ("A" or "B" or "tie").
Your choice:
```
</details>
## Case studies
Here are some examples of the generated sentences from the models.
<details>
<summary> Example 1 </summary>
- **Concepts**: `['demonstrate_V', 'machine_N', 'sew_V', 'use_V']`
- **Human Reference**: `She used the sewing machine to demonstrate how well she could sew.`
- **Mixtral-8x7B-Instruct-v0.1**: ` During the fashion design class, the teacher demonstrates how to use the sewing machine to create a stylish outfit.`
--> ```Versus=lose; Cover=False; PoS=False; Len=20```
- **Yi-34b-chat**: `The tailor demonstrates how to use the sewing machine.`
--> ```Versus=win; Cover=False; PoS=False; Len=9```
- **Yi-6b-chat**: `The machine demonstrates how to sew a useful item.`
--> ```Versus=lose; Cover=False; PoS=False; Len=9```
- **Llama-2-7b-chat-hf**: `The teacher demonstrates how to use the sewing machine to sew a new dress, which the student will use to create a beautiful outfit for the school play.`
--> ```Versus=lose; Cover=True; PoS=True; Len=28```
- **zephyr-7b-beta**: `The tailor demonstrates how to use the sewing machine to create garments by allowing the customer to try it out themselves.`
--> ```Versus=lose; Cover=False; PoS=False; Len=21```
- **tulu-2-dpo-70b**: `A woman demonstrates how to use a machine to sew, showing the audience step by step.`
--> ```Versus=lose; Cover=True; PoS=True; Len=16```
- **vicuna-13b-v1.5**: ` The tailor demonstrates how to use the sewing machine to make a dress.`
--> ```Versus=lose; Cover=False; PoS=False; Len=14```
- **gpt-3.5-turbo**: `The tailor uses the sewing machine to demonstrate how to sew a straight line.`
--> ```Versus=lose; Cover=True; PoS=True; Len=14```
- **gpt-4-0613**: `The instructor demonstrates how to use the sewing machine by sewing a piece of fabric.`
--> ```Versus=lose; Cover=True; PoS=True; Len=15```
- **gpt-4-1106-preview**: `The tailor demonstrates how to use the sewing machine by making a small pouch.`
--> ```Versus=win; Cover=False; PoS=False; Len=14```
</details>
<details>
<summary> Example 2 </summary>
- **Concepts**: `['car_N', 'drive_V', 'road_N', 'water_N']`
- **Human Reference**: `A man drives the car slowly because there is water on the road`
- **Mixtral-8x7B-Instruct-v0.1**: ` During the road trip, the family takes breaks to drink water next to the car driving on the open road.`
--> ```Versus=lose; Cover=True; PoS=True; Len=21```
- **Yi-34b-chat**: `The car drives down the road splashing water from the puddles.`
--> ```Versus=win; Cover=True; PoS=True; Len=11```
- **Yi-6b-chat**: `The car drives along the road, avoiding the puddle of water.`
--> ```Versus=win; Cover=True; PoS=True; Len=11```
- **Llama-2-7b-chat-hf**: `The driver drives the car along the road, feeling the water splashing against the tires.`
--> ```Versus=lose; Cover=True; PoS=True; Len=15```
- **zephyr-7b-beta**: `The driver takes the car along the road until they reach a place with water, where they cannot drive anymore.`
--> ```Versus=lose; Cover=True; PoS=True; Len=20```
- **tulu-2-dpo-70b**: `A person drives their car on the road next to a river, where the water is flowing smoothly.`
--> ```Versus=lose; Cover=True; PoS=True; Len=18```
- **vicuna-13b-v1.5**: ` The car drives on the road by the water.`
--> ```Versus=lose; Cover=True; PoS=True; Len=10```
- **gpt-3.5-turbo**: `I drive my car on the road next to the water.`
--> ```Versus=win; Cover=True; PoS=True; Len=11```
- **gpt-4-0613**: `The man drives his car on the road that runs alongside the sparkling water.`
--> ```Versus=lose; Cover=True; PoS=True; Len=14```
- **gpt-4-1106-preview**: `The car drives smoothly on the wet road, splashing water as it goes.`
--> ```Versus=win; Cover=True; PoS=True; Len=13```
</details>
<details>
<summary> Example 3 </summary>
- **Concepts**: `['ceiling_N', 'climb_V', 'gym_N', 'rope_N', 'suspend_V']`
- **Human Reference**: `The boys climbed the rope suspended from the gym ceiling.`
- **Mixtral-8x7B-Instruct-v0.1**: ` During gym class, the students take turns climbing the suspended rope while the others cheer from below, careful not to hit their heads on the ceiling.`
--> ```Versus=lose; Cover=True; PoS=True; Len=27```
- **Yi-34b-chat**: `The gymnast climbs the rope and suspends from the ceiling of the gym.`
--> ```Versus=lose; Cover=True; PoS=True; Len=13```
- **Yi-6b-chat**: `The climber suspends themselves on the rope as they climb the gym's ceiling.`
--> ```Versus=lose; Cover=True; PoS=True; Len=13```
- **Llama-2-7b-chat-hf**: `The gym teacher suspends herself from the ceiling using a rope while climbing to reach the high bar for a workout.`
--> ```Versus=lose; Cover=True; PoS=True; Len=21```
- **zephyr-7b-beta**: `Gymnasts suspend themselves from the ceiling by climbing on the ropes in the gym.`
--> ```Versus=lose; Cover=True; PoS=True; Len=14```
- **tulu-2-dpo-70b**: `In the gym, an athlete climbs a rope suspended from the ceiling to improve their physical fitness.`
--> ```Versus=lose; Cover=True; PoS=True; Len=17```
- **vicuna-13b-v1.5**: ` In the gym, she climbs up to the ceiling using a rope that is suspended from the top.`
--> ```Versus=lose; Cover=True; PoS=True; Len=19```
- **gpt-3.5-turbo**: `In the gym, the athlete climbs the rope and suspends himself from the ceiling.`
--> ```Versus=lose; Cover=True; PoS=True; Len=14```
- **gpt-4-0613**: `In the gym, a man climbs a rope that is suspended from the ceiling.`
--> ```Versus=lose; Cover=True; PoS=True; Len=14```
- **gpt-4-1106-preview**: `In the gym, the rope is suspended from the ceiling for athletes to climb during their workout.`
--> ```Versus=lose; Cover=True; PoS=True; Len=17```
</details>
<details>
<summary> Example 4 </summary>
- **Concepts**: `['hat_N', 'knit_V', 'pile_N', 'rock_N', 'sit_V']`
- **Human Reference**: `The woman sat on a pile of rocks and knitted a hat.`
- **Mixtral-8x7B-Instruct-v0.1**: ` During her break, a woman sits on a rock and starts to knit a pile of hats to sell at the local market.`
--> ```Versus=lose; Cover=True; PoS=True; Len=24```
- **Yi-34b-chat**: `The knitted hat sat on a pile of rocks.`
--> ```Versus=lose; Cover=False; PoS=False; Len=9```
- **Yi-6b-chat**: `The woman knits a cozy hat while sitting on a rock, placing the pile of rocks around her.`
--> ```Versus=lose; Cover=True; PoS=True; Len=18```
- **Llama-2-7b-chat-hf**: `The old man sits on a rock and knits a new hat while piling up the ones he already has, enjoying the peaceful atmosphere.`
--> ```Versus=lose; Cover=True; PoS=False; Len=24```
- **zephyr-7b-beta**: `A woman sits on a rock and knits a hat from a pile of yarn.`
--> ```Versus=lose; Cover=True; PoS=True; Len=15```
- **tulu-2-dpo-70b**: `An elderly woman knits a colorful hat while sitting on a pile of rocks, enjoying the peaceful scenery around her.`
--> ```Versus=lose; Cover=True; PoS=True; Len=20```
- **vicuna-13b-v1.5**: ` The old man sits on a rock and knits a hat in a pile of yarn.`
--> ```Versus=lose; Cover=True; PoS=True; Len=17```
- **gpt-3.5-turbo**: `The old man sits on a rock, wearing a knitted hat, while a pile of leaves gathers around him.`
--> ```Versus=lose; Cover=False; PoS=False; Len=19```
- **gpt-4-0613**: `The woman sits on a rock, knitting a hat, with a pile of yarn beside her.`
--> ```Versus=lose; Cover=True; PoS=True; Len=16```
- **gpt-4-1106-preview**: `She sits on a rock, knitting a hat, and places the finished ones in a colorful pile beside her.`
--> ```Versus=lose; Cover=True; PoS=True; Len=19```
</details>
## Links
- Contact: [Bill Yuchen Lin](https://yuchenlin.xyz/)
- Project website: [https://inklab.usc.edu/CommonGen/](https://inklab.usc.edu/CommonGen/)
- HuggingFace Dataset: [CommonGen-lite](https://huggingface.co/datasets/allenai/commongen_lite)
## Citation
```bibtex
@inproceedings{lin-etal-2020-commongen,
title = "{C}ommon{G}en: A Constrained Text Generation Challenge for Generative Commonsense Reasoning",
author = "Lin, Bill Yuchen and
Zhou, Wangchunshu and
Shen, Ming and
Zhou, Pei and
Bhagavatula, Chandra and
Choi, Yejin and
Ren, Xiang",
booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2020",
month = nov,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.findings-emnlp.165",
pages = "1823--1840",
}
```
|
TheGreatRambler/mm2_user_liked | ---
language:
- multilingual
license:
- cc-by-nc-sa-4.0
multilinguality:
- multilingual
size_categories:
- 100M<n<1B
source_datasets:
- original
task_categories:
- other
- object-detection
- text-retrieval
- token-classification
- text-generation
task_ids: []
pretty_name: Mario Maker 2 user likes
tags:
- text-mining
---
# Mario Maker 2 user likes
Part of the [Mario Maker 2 Dataset Collection](https://tgrcode.com/posts/mario_maker_2_datasets)
## Dataset Description
The Mario Maker 2 user likes dataset consists of 105.5 million user likes from Nintendo's online service totaling around 630MB of data. The dataset was created using the self-hosted [Mario Maker 2 api](https://tgrcode.com/posts/mario_maker_2_api) over the course of 1 month in February 2022.
### How to use it
The Mario Maker 2 user likes dataset is a very large dataset, so for most use cases it is recommended to make use of the streaming API of `datasets`. You can load and iterate through the dataset with the following code:
```python
from datasets import load_dataset
ds = load_dataset("TheGreatRambler/mm2_user_liked", streaming=True, split="train")
print(next(iter(ds)))
#OUTPUT:
{
'pid': '14510618610706594411',
'data_id': 25861713
}
```
Each row is a unique like: the player denoted by `pid` liked the level denoted by `data_id`.
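As an illustration (not part of the original release tooling), per-level like counts can be tallied directly from the stream:
```python
from collections import Counter
from datasets import load_dataset

ds = load_dataset("TheGreatRambler/mm2_user_liked", streaming=True, split="train")

# Tally likes per level over the first 100k rows (illustrative cutoff).
likes_per_level = Counter()
for i, row in enumerate(ds):
    likes_per_level[row["data_id"]] += 1
    if i == 100_000:
        break

print(likes_per_level.most_common(10))  # ten most-liked levels in the sample
```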
You can also download the full dataset. Note that this will download ~630MB:
```python
ds = load_dataset("TheGreatRambler/mm2_user_liked", split="train")
```
## Data Structure
### Data Instances
```python
{
'pid': '14510618610706594411',
'data_id': 25861713
}
```
### Data Fields
|Field|Type|Description|
|---|---|---|
|pid|string|The player ID of this user, an unsigned 64 bit integer as a string|
|data_id|int|The data ID of the level this user liked|
### Data Splits
The dataset only contains a train split.
<!-- TODO create detailed statistics -->
## Dataset Creation
The dataset was created over a little more than a month in February 2022 using the self-hosted [Mario Maker 2 api](https://tgrcode.com/posts/mario_maker_2_api). As requests made to Nintendo's servers require authentication, the process had to be done with utmost care, limiting download speed so as not to overload the API and risk a ban. There are no intentions to create an updated release of this dataset.
## Considerations for Using the Data
The dataset contains no harmful language or depictions.
|
dongyoung4091/hh-generated_flan_t5_large_flan_t5_zeroshot_DA_Bard_Generated | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: zeroshot_relevance
dtype: float64
- name: zeroshot_completeness
dtype: float64
- name: zeroshot_clarity
dtype: float64
- name: zeroshot_accuracy
dtype: float64
- name: zeroshot_objectivity
dtype: float64
- name: zeroshot_creativity
dtype: float64
- name: zeroshot_engagement
dtype: float64
- name: zeroshot_personalization
dtype: float64
- name: zeroshot_politeness
dtype: float64
- name: zeroshot_efficiency
dtype: float64
- name: zeroshot_timeliness
dtype: float64
splits:
- name: train
num_bytes: 5926757
num_examples: 25600
download_size: 765908
dataset_size: 5926757
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "hh-generated_flan_t5_large_flan_t5_zeroshot_DA_Bard_Generated"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ai2lumos/lumos_multimodal_ground_iterative | ---
license: apache-2.0
task_categories:
- text-generation
- question-answering
language:
- en
tags:
- language-agent
- reasoning
- visual-question-answering
- grounding
size_categories:
- 10K<n<100K
---
# 🪄 Agent Lumos: Unified and Modular Training for Open-Source Language Agents
<p align="center">
🌐<a href="https://allenai.github.io/lumos">[Website]</a>
📝<a href="https://arxiv.org/abs/2311.05657">[Paper]</a>
🤗<a href="https://huggingface.co/datasets?sort=trending&search=ai2lumos">[Data]</a>
🤗<a href="https://huggingface.co/models?sort=trending&search=ai2lumos">[Model]</a>
🤗<a href="https://huggingface.co/spaces/ai2lumos/lumos_data_demo">[Demo]</a>
</p>
We introduce 🪄**Lumos**, Language Agents with **Unified** Formats, **Modular** Design, and **Open-Source** LLMs. **Lumos** unifies a suite of complex interactive tasks and achieves competitive performance with GPT-4/3.5-based and larger open-source agents.
**Lumos** has the following features:
* 🧩 **Modular Architecture**:
- 🧩 **Lumos** consists of planning, grounding, and execution modules built based on LLAMA-2-7B/13B and off-the-shelf APIs.
- 🤗 **Lumos** utilizes a unified data format that encompasses multiple task types, thereby enabling the developed agent framework to conveniently support a range of interactive tasks.
* 🌍 **Diverse Training Data**:
- 🌍 **Lumos** is trained with ~56K diverse high-quality subgoal/action annotations from ground-truth reasoning steps in existing benchmarks with GPT-4.
- ⚒️ **Lumos** data can be instrumental for future research in developing open-source agents for complex interactive tasks.
* 🚀 **Competitive Performance**:
  - 🚀 **Lumos** is comparable to or even beats **GPT-series** agents on web/complex QA tasks Mind2Web and HotpotQA, and **larger open agents** on math and multimodal tasks.
- 🚀 **Lumos** exceeds contemporaneous agents that have been **fine-tuned** with in-domain HotpotQA, Mind2Web and ScienceQA annotations, such as **FiReAct**, **AgentLM**, and **AutoAct**.
- 🚀 **Lumos** performs better than open agent baseline formulations including **chain-of-thoughts** and **integrated** training.
- 🚀 **Lumos** surpasses larger open LLM agents and domain-specific agents on unseen tasks, WebShop and InterCode_SQL.
## Data Overview
`lumos_multimodal_ground_iterative` is the data for training the **grounding** module on the **multimodal** task in the **Lumos-Iterative (Lumos-I)** formulation.
The source of the training annotations is shown below:
| Datasets | Number |
|---|---|
|A-OKVQA|15941|
## Models Trained with the Data
`lumos_multimodal_ground_iterative` is used to train the following models.
|Model|Huggingface Repo|
|---|---|
|`lumos_multimodal_ground_iterative`| [🤗Huggingface Repo](https://huggingface.co/ai2lumos/lumos_multimodal_ground_iterative) |
|`lumos_multimodal_ground_iterative-13B`| [🤗Huggingface Repo](https://huggingface.co/ai2lumos/lumos_multimodal_ground_iterative-13B) |
|`lumos_unified_ground_iterative`| [🤗Huggingface Repo](https://huggingface.co/ai2lumos/lumos_unified_ground_iterative) |
|`lumos_unified_ground_iterative-13B`| [🤗Huggingface Repo](https://huggingface.co/ai2lumos/lumos_unified_ground_iterative-13B) |
## Citation
If you find this work relevant to your research, please feel free to cite it!
```
@article{yin2023lumos,
title={Agent Lumos: Unified and Modular Training for Open-Source Language Agents},
author={Yin, Da and Brahman, Faeze and Ravichander, Abhilasha and Chandu, Khyathi and Chang, Kai-Wei and Choi, Yejin and Lin, Bill Yuchen},
journal={arXiv preprint arXiv:2311.05657},
year={2023}
}
``` |
cpryan/288-demo | ---
license: pddl
---
|
reciprocate/gsm8k_train_pairwise | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: selected
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 5031130
num_examples: 7043
download_size: 2891682
dataset_size: 5031130
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "gsm8k_train_pairwise"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/med_alpaca_standardized_cluster_99_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 4631811
num_examples: 14028
download_size: 1757658
dataset_size: 4631811
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_99_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joemerson/coringa | ---
license: openrail
---
|
erishabh/unit-test-v2 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 2289947144
num_examples: 215409
download_size: 239191939
dataset_size: 2289947144
---
# Dataset Card for "unit-test-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yiweifu/relearn_retain | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 2064131.0
num_examples: 6
download_size: 2066131
dataset_size: 2064131.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
existence-master/bloomify-classification-0.6k-simple | ---
license: gpl-3.0
---
|
Trelis/openassistant-falcon | ---
license: apache-2.0
language:
- en
- es
- ru
- de
- pl
- th
- vi
- sv
- bn
- da
- he
- it
- fa
- sk
- id
- nb
- el
- nl
- hu
- eu
- zh
- eo
- ja
- ca
- cs
- bg
- fi
- pt
- tr
- ro
- ar
- uk
- gl
- fr
- ko
tags:
- human-feedback
- llama-2
size_categories:
- 1K<n<10k
pretty_name: Filtered OpenAssistant Conversations
---
# Chat Fine-tuning Dataset - OpenAssistant Falcon
This dataset allows for fine-tuning chat models using '\nHuman:' and '\nAssistant:' to wrap user messages.
It still uses <|endoftext|> as the EOS and BOS token, as per Falcon.
Preparation:
1. The dataset is cloned from [TimDettmers](https://huggingface.co/datasets/timdettmers/openassistant-guanaco), which itself is a subset of the Open Assistant dataset, which you can find [here](https://huggingface.co/datasets/OpenAssistant/oasst1/tree/main). This subset of the data only contains the highest-rated paths in the conversation tree, with a total of 9,846 samples.
1. The dataset was then filtered to:
- replace instances of '### Human:' with '\nHuman:'
- replace instances of '### Assistant:' with '\nAssistant:'
- end assistant responses with <|endoftext|> (to encourage the model to emit <|endoftext|> when it has finished a response); a minimal sketch of this filtering is shown below.
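A minimal sketch of this filtering, assuming the guanaco-style single-`text`-column layout of the source dataset (the exact script used is not included here):
```python
from datasets import load_dataset

EOS = "<|endoftext|>"

def reformat(example):
    text = example["text"]
    text = text.replace("### Human:", "\nHuman:")
    text = text.replace("### Assistant:", "\nAssistant:")
    # Close every assistant turn with the Falcon EOS token: one before each
    # follow-up human turn, and one at the very end of the conversation.
    text = text.replace("\nHuman:", EOS + "\nHuman:")
    if text.startswith(EOS):
        text = text[len(EOS):]  # no EOS before the opening human turn
    return {"text": text + EOS}

ds = load_dataset("timdettmers/openassistant-guanaco", split="train")
ds = ds.map(reformat)
```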
Details of the root dataset follow, copied from that repo:
# OpenAssistant Conversations Dataset (OASST1)
## Dataset Description
- **Homepage:** https://www.open-assistant.io/
- **Repository:** https://github.com/LAION-AI/Open-Assistant
- **Paper:** https://arxiv.org/abs/2304.07327
### Dataset Summary
In an effort to democratize research on large-scale alignment, we release OpenAssistant
Conversations (OASST1), a human-generated, human-annotated assistant-style conversation
corpus consisting of 161,443 messages in 35 different languages, annotated with 461,292
quality ratings, resulting in over 10,000 fully annotated conversation trees. The corpus
is a product of a worldwide crowd-sourcing effort involving over 13,500 volunteers.
Please refer to our [paper](https://arxiv.org/abs/2304.07327) for further details.
### Dataset Structure
This dataset contains message trees. Each message tree has an initial prompt message as the root node,
which can have multiple child messages as replies, and these child messages can have multiple replies.
All messages have a role property: this can either be "assistant" or "prompter". The roles in
conversation threads from prompt to leaf node strictly alternate between "prompter" and "assistant".
This version of the dataset contains data collected on the [open-assistant.io](https://open-assistant.io/) website until April 12 2023.
### JSON Example: Message
For readability, the following JSON examples are shown formatted with indentation on multiple lines.
Objects are stored without indentation (on single lines) in the actual jsonl files.
```json
{
"message_id": "218440fd-5317-4355-91dc-d001416df62b",
"parent_id": "13592dfb-a6f9-4748-a92c-32b34e239bb4",
"user_id": "8e95461f-5e94-4d8b-a2fb-d4717ce973e4",
"text": "It was the winter of 2035, and artificial intelligence (..)",
"role": "assistant",
"lang": "en",
"review_count": 3,
"review_result": true,
"deleted": false,
"rank": 0,
"synthetic": true,
"model_name": "oasst-sft-0_3000,max_new_tokens=400 (..)",
"labels": {
"spam": { "value": 0.0, "count": 3 },
"lang_mismatch": { "value": 0.0, "count": 3 },
"pii": { "value": 0.0, "count": 3 },
"not_appropriate": { "value": 0.0, "count": 3 },
"hate_speech": { "value": 0.0, "count": 3 },
"sexual_content": { "value": 0.0, "count": 3 },
"quality": { "value": 0.416, "count": 3 },
"toxicity": { "value": 0.16, "count": 3 },
"humor": { "value": 0.0, "count": 3 },
"creativity": { "value": 0.33, "count": 3 },
"violence": { "value": 0.16, "count": 3 }
}
}
```
### JSON Example: Conversation Tree
For readability, only a subset of the message properties is shown here.
```json
{
"message_tree_id": "14fbb664-a620-45ce-bee4-7c519b16a793",
"tree_state": "ready_for_export",
"prompt": {
"message_id": "14fbb664-a620-45ce-bee4-7c519b16a793",
"text": "Why can't we divide by 0? (..)",
"role": "prompter",
"lang": "en",
"replies": [
{
"message_id": "894d30b6-56b4-4605-a504-89dd15d4d1c8",
"text": "The reason we cannot divide by zero is because (..)",
"role": "assistant",
"lang": "en",
"replies": [
// ...
]
},
{
"message_id": "84d0913b-0fd9-4508-8ef5-205626a7039d",
"text": "The reason that the result of a division by zero is (..)",
"role": "assistant",
"lang": "en",
"replies": [
{
"message_id": "3352725e-f424-4e3b-a627-b6db831bdbaa",
"text": "Math is confusing. Like those weird Irrational (..)",
"role": "prompter",
"lang": "en",
"replies": [
{
"message_id": "f46207ca-3149-46e9-a466-9163d4ce499c",
"text": "Irrational numbers are simply numbers (..)",
"role": "assistant",
"lang": "en",
"replies": []
},
// ...
]
}
]
}
]
}
}
```
Please refer to [oasst-data](https://github.com/LAION-AI/Open-Assistant/tree/main/oasst-data) for
details about the data structure and Python code to read and write jsonl files containing oasst data objects.
If you would like to explore the dataset yourself you can find a
[`getting-started`](https://github.com/LAION-AI/Open-Assistant/blob/main/notebooks/openassistant-oasst1/getting-started.ipynb)
notebook in the `notebooks/openassistant-oasst1` folder of the [LAION-AI/Open-Assistant](https://github.com/LAION-AI/Open-Assistant)
github repository.
## Main Dataset Files
Conversation data is provided either as nested messages in trees (extension `.trees.jsonl.gz`)
or as a flat list (table) of messages (extension `.messages.jsonl.gz`).
### Ready For Export Trees
```
2023-04-12_oasst_ready.trees.jsonl.gz 10,364 trees with 88,838 total messages
2023-04-12_oasst_ready.messages.jsonl.gz 88,838 messages
```
Trees in `ready_for_export` state without spam and deleted messages including message labels.
The oasst_ready-trees file usually is sufficient for supervised fine-tuning (SFT) & reward model (RM) training.
### All Trees
```
2023-04-12_oasst_all.trees.jsonl.gz 66,497 trees with 161,443 total messages
2023-04-12_oasst_all.messages.jsonl.gz 161,443 messages
```
All trees, including those in states `prompt_lottery_waiting` (trees that consist of only one message, namely the initial prompt),
`aborted_low_grade` (trees that stopped growing because the messages had low quality), and `halted_by_moderator`.
### Supplemental Exports: Spam & Prompts
```
2023-04-12_oasst_spam.messages.jsonl.gz
```
These are messages which were deleted or have a negative review result (`"review_result": false`).
Besides low quality, a frequent reason for message deletion is a wrong language tag.
```
2023-04-12_oasst_prompts.messages.jsonl.gz
```
These are all the kept initial prompt messages with positive review result (no spam) of trees in `ready_for_export` or `prompt_lottery_waiting` state.
### Using the Huggingface Datasets
While HF datasets is ideal for tabular datasets, it is not a natural fit for nested data structures like the OpenAssistant conversation trees.
Nevertheless, we make all messages which can also be found in the file `2023-04-12_oasst_ready.trees.jsonl.gz` available in parquet as train/validation splits.
These are directly loadable by [Huggingface Datasets](https://pypi.org/project/datasets/).
To load the oasst1 train & validation splits use:
```python
from datasets import load_dataset
ds = load_dataset("OpenAssistant/oasst1")
train = ds['train'] # len(train)=84437 (95%)
val = ds['validation'] # len(val)=4401 (5%)
```
The messages appear in depth-first order of the message trees.
Full conversation trees can be reconstructed from the flat messages table by using the `parent_id`
and `message_id` properties to identify the parent-child relationship of messages. The `message_tree_id`
and `tree_state` properties (only present in flat messages files) can be used to find all messages of a message tree or to select trees by their state.
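A minimal sketch of that reconstruction, assuming root prompts carry a null `parent_id` (field names as documented above; the grouping helper is illustrative):
```python
from collections import defaultdict
from datasets import load_dataset

messages = load_dataset("OpenAssistant/oasst1", split="train")

# Index children by parent_id; messages without a parent are tree roots.
children = defaultdict(list)
roots = []
for m in messages:
    if m["parent_id"] is None:
        roots.append(m)
    else:
        children[m["parent_id"]].append(m)

def build_tree(message):
    # Recursively attach replies, mirroring the nested trees export.
    return {**message, "replies": [build_tree(c) for c in children[message["message_id"]]]}

trees = [build_tree(root) for root in roots]
```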
### Languages
OpenAssistant Conversations incorporates 35 different languages with a distribution of messages as follows:
**Languages with over 1000 messages**
- English: 71956
- Spanish: 43061
- Russian: 9089
- German: 5279
- Chinese: 4962
- French: 4251
- Thai: 3042
- Portuguese (Brazil): 2969
- Catalan: 2260
- Korean: 1553
- Ukrainian: 1352
- Italian: 1320
- Japanese: 1018
<details>
<summary><b>Languages with under 1000 messages</b></summary>
<ul>
<li>Vietnamese: 952</li>
<li>Basque: 947</li>
<li>Polish: 886</li>
<li>Hungarian: 811</li>
<li>Arabic: 666</li>
<li>Dutch: 628</li>
<li>Swedish: 512</li>
<li>Turkish: 454</li>
<li>Finnish: 386</li>
<li>Czech: 372</li>
<li>Danish: 358</li>
<li>Galician: 339</li>
<li>Hebrew: 255</li>
<li>Romanian: 200</li>
<li>Norwegian Bokmål: 133</li>
<li>Indonesian: 115</li>
<li>Bulgarian: 95</li>
<li>Bengali: 82</li>
<li>Persian: 72</li>
<li>Greek: 66</li>
<li>Esperanto: 59</li>
<li>Slovak: 19</li>
</ul>
</details>
## Contact
- Discord [Open Assistant Discord Server](https://ykilcher.com/open-assistant-discord)
- GitHub: [LAION-AI/Open-Assistant](https://github.com/LAION-AI/Open-Assistant)
- E-Mail: [open-assistant@laion.ai](mailto:open-assistant@laion.ai) |
symeneses/merlin | ---
license: cc-by-sa-4.0
task_categories:
- text-classification
language:
- de
- it
- cs
pretty_name: MERLIN Written Learner Corpus for Czech, German, Italian 1.1.
size_categories:
- 1K<n<10K
---
# Dataset Card for MERLIN
The MERLIN corpus is a written learner corpus for Czech, German, and Italian that has been
designed to illustrate the Common European Framework of Reference for Languages (CEFR) with
authentic learner data. The corpus contains learner texts produced in standardized language
certifications covering CEFR levels A1-C1. The MERLIN annotation scheme includes a wide
range of language characteristics that provide researchers with concrete examples of learner
performance and progress across multiple proficiency levels.
## Dataset Details
### Dataset Description
The MERLIN corpus contains 2,286 texts for learners of Italian, German and Czech that were taken from written examinations of acknowledged test institutions. The exams aim to test knowledge across the levels A1-C1 of the Common European Framework of Reference (CEFR).
- **Homepage:** https://merlin-platform.eu/
- **Funded by:** The MERLIN project was funded from 2012 until 2014 by the EU Lifelong Learning Programme under project number 518989-LLP-1-2011-1-DE-KA2-KA2MP.
- **Shared by:** Since 2018, corpus data are available through the CLARIN network.
- **Language(s) (NLP):** Czech, German and Italian
- **License:** Creative Commons - Attribution-ShareAlike 4.0 International (CC BY-SA 4.0)
### Dataset Sources
- **Data PID:** https://hdl.handle.net/20.500.12124/6
- **Version-controlled data (Git):** https://gitlab.inf.unibz.it/commul/merlin-platform/data-bundle
- **Paper:** Boyd, A., Hana, J., Nicolas, L., Meurers, D., Wisniewski, K., Abel, A., Schöne, K., Štindlová, B., & Vettori, C. (2014). The MERLIN corpus: Learner language and the CEFR. Proceedings of the 9th International Conference on Language Resources and Evaluation (LREC 14), 26-31 May 2014, 1281–1288.
http://www.lrec-conf.org/proceedings/lrec2014/summaries/606.html.
## Uses
- Teachers and material writers
- Curriculum design and course planning
- Language testing
For more details and practical examples, see [use cases](https://www.merlin-platform.eu/C_teacher.php).
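As a starting point for such use cases, a minimal loading sketch with the `datasets` library (assuming the default configuration of this Hub repository; inspect the returned splits and columns before relying on them):
```python
from datasets import load_dataset

# Load the MERLIN learner corpus from the Hugging Face Hub.
ds = load_dataset("symeneses/merlin")

# Inspect available splits, columns, and a sample record.
print(ds)
```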
## Citation
**BibTeX:**
@misc{20.500.12124/6,
title = {{MERLIN} Written Learner Corpus for Czech, German, Italian 1.1},
author = {Wisniewski, Katrin and Abel, Andrea and Vodi{\v c}kov{\'a}, Kate{\v r}ina and Plassmann, Sybille and Meurers, Detmar and Woldt, Claudia and Sch{\"o}ne, Karin and Blaschitz, Verena and Lyding, Verena and Nicolas, Lionel and Vettori, Chiara and Pe{\v c}en{\'y}, Pavel and Hana, Jirka and {\v C}urdov{\'a}, Veronika and {\v S}tindlov{\'a}, Barbora and Klein, Gudrun and Lauppe, Louise and Boyd, Adriane and Bykh, Serhiy and Krivanek, Julia},
url = {http://hdl.handle.net/20.500.12124/6},
note = {Eurac Research {CLARIN} Centre},
copyright = {Creative Commons - Attribution-{ShareAlike} 4.0 International ({CC} {BY}-{SA} 4.0)},
year = {2018} }
|
freshpearYoon/v3_train_free_concat_33 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 3842514224
num_examples: 2500
download_size: 1740012680
dataset_size: 3842514224
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zyznull/msmarco-passage-corpus | ---
license: mit
---
|
open-llm-leaderboard/details_jikaixuan__test | ---
pretty_name: Evaluation run of jikaixuan/test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jikaixuan/test](https://huggingface.co/jikaixuan/test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jikaixuan__test\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-30T00:21:29.315161](https://huggingface.co/datasets/open-llm-leaderboard/details_jikaixuan__test/blob/main/results_2023-12-30T00-21-29.315161.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6083156130977855,\n\
\ \"acc_stderr\": 0.03325697047846345,\n \"acc_norm\": 0.6149057049790325,\n\
\ \"acc_norm_stderr\": 0.03395674600613233,\n \"mc1\": 0.40758873929008566,\n\
\ \"mc1_stderr\": 0.01720194923455311,\n \"mc2\": 0.575115982039762,\n\
\ \"mc2_stderr\": 0.015744867615337492\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5930034129692833,\n \"acc_stderr\": 0.01435639941800912,\n\
\ \"acc_norm\": 0.6228668941979523,\n \"acc_norm_stderr\": 0.014163366896192594\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6490738896634136,\n\
\ \"acc_stderr\": 0.004762844770909858,\n \"acc_norm\": 0.8441545508862777,\n\
\ \"acc_norm_stderr\": 0.003619674864035017\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.02937364625323469,\n\
\ \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.02937364625323469\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.032662042990646775,\n\
\ \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.032662042990646775\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159795,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159795\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n\
\ \"acc_stderr\": 0.024685979286239963,\n \"acc_norm\": 0.7483870967741936,\n\
\ \"acc_norm_stderr\": 0.024685979286239963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533086,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533086\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.02749350424454805,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.02749350424454805\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.02460362692409742,\n \
\ \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.02460362692409742\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8055045871559633,\n \"acc_stderr\": 0.01697028909045803,\n \"\
acc_norm\": 0.8055045871559633,\n \"acc_norm_stderr\": 0.01697028909045803\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.033922384053216174,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.033922384053216174\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588674,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588674\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965835,\n \
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965835\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.03236198350928276,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.03236198350928276\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.04118438565806298,\n\
\ \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.04118438565806298\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.04139112727635463,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.04139112727635463\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7752234993614304,\n\
\ \"acc_stderr\": 0.01492744710193715,\n \"acc_norm\": 0.7752234993614304,\n\
\ \"acc_norm_stderr\": 0.01492744710193715\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.025248264774242826,\n\
\ \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.025248264774242826\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33854748603351953,\n\
\ \"acc_stderr\": 0.01582670009648135,\n \"acc_norm\": 0.33854748603351953,\n\
\ \"acc_norm_stderr\": 0.01582670009648135\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.026643278474508755,\n\
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.026643278474508755\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n\
\ \"acc_stderr\": 0.02666441088693762,\n \"acc_norm\": 0.6720257234726688,\n\
\ \"acc_norm_stderr\": 0.02666441088693762\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.02622964917882117,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.02622964917882117\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4315514993481095,\n\
\ \"acc_stderr\": 0.01265000799946388,\n \"acc_norm\": 0.4315514993481095,\n\
\ \"acc_norm_stderr\": 0.01265000799946388\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.02909720956841195,\n\
\ \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.02909720956841195\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6176470588235294,\n \"acc_stderr\": 0.019659922493623343,\n \
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.019659922493623343\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6775510204081633,\n \"acc_stderr\": 0.029923100563683906,\n\
\ \"acc_norm\": 0.6775510204081633,\n \"acc_norm_stderr\": 0.029923100563683906\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n\
\ \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n\
\ \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40758873929008566,\n\
\ \"mc1_stderr\": 0.01720194923455311,\n \"mc2\": 0.575115982039762,\n\
\ \"mc2_stderr\": 0.015744867615337492\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.01163126836060778\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2721758908263836,\n \
\ \"acc_stderr\": 0.012259714035164548\n }\n}\n```"
repo_url: https://huggingface.co/jikaixuan/test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|arc:challenge|25_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|gsm8k|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hellaswag|10_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T00-21-29.315161.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T00-21-29.315161.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- '**/details_harness|winogrande|5_2023-12-30T00-21-29.315161.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-30T00-21-29.315161.parquet'
- config_name: results
data_files:
- split: 2023_12_30T00_21_29.315161
path:
- results_2023-12-30T00-21-29.315161.parquet
- split: latest
path:
- results_2023-12-30T00-21-29.315161.parquet
---
# Dataset Card for Evaluation run of jikaixuan/test
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jikaixuan/test](https://huggingface.co/jikaixuan/test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jikaixuan__test",
"harness_winogrande_5",
split="train")
```
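A specific run can also be loaded directly by passing its timestamped split name (here the single run in this repository) instead of `latest`:
```python
from datasets import load_dataset
# Load the results of one evaluation run by its timestamped split name.
data = load_dataset("open-llm-leaderboard/details_jikaixuan__test",
	"harness_winogrande_5",
	split="2023_12_30T00_21_29.315161")
```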
## Latest results
These are the [latest results from run 2023-12-30T00:21:29.315161](https://huggingface.co/datasets/open-llm-leaderboard/details_jikaixuan__test/blob/main/results_2023-12-30T00-21-29.315161.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6083156130977855,
"acc_stderr": 0.03325697047846345,
"acc_norm": 0.6149057049790325,
"acc_norm_stderr": 0.03395674600613233,
"mc1": 0.40758873929008566,
"mc1_stderr": 0.01720194923455311,
"mc2": 0.575115982039762,
"mc2_stderr": 0.015744867615337492
},
"harness|arc:challenge|25": {
"acc": 0.5930034129692833,
"acc_stderr": 0.01435639941800912,
"acc_norm": 0.6228668941979523,
"acc_norm_stderr": 0.014163366896192594
},
"harness|hellaswag|10": {
"acc": 0.6490738896634136,
"acc_stderr": 0.004762844770909858,
"acc_norm": 0.8441545508862777,
"acc_norm_stderr": 0.003619674864035017
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.02937364625323469,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.02937364625323469
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.032662042990646775,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.032662042990646775
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159795,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159795
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.024685979286239963,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.024685979286239963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533086,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533086
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.02749350424454805,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.02749350424454805
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.02460362692409742,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.02460362692409742
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8055045871559633,
"acc_stderr": 0.01697028909045803,
"acc_norm": 0.8055045871559633,
"acc_norm_stderr": 0.01697028909045803
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.033922384053216174,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.033922384053216174
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588674,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588674
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965835,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965835
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928276,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928276
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.04118438565806298,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.04118438565806298
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.04139112727635463,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.04139112727635463
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7752234993614304,
"acc_stderr": 0.01492744710193715,
"acc_norm": 0.7752234993614304,
"acc_norm_stderr": 0.01492744710193715
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.025248264774242826,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.025248264774242826
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33854748603351953,
"acc_stderr": 0.01582670009648135,
"acc_norm": 0.33854748603351953,
"acc_norm_stderr": 0.01582670009648135
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.026643278474508755,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.026643278474508755
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.02666441088693762,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.02666441088693762
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.02622964917882117,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.02622964917882117
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4315514993481095,
"acc_stderr": 0.01265000799946388,
"acc_norm": 0.4315514993481095,
"acc_norm_stderr": 0.01265000799946388
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.02909720956841195,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.02909720956841195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.019659922493623343,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.019659922493623343
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6775510204081633,
"acc_stderr": 0.029923100563683906,
"acc_norm": 0.6775510204081633,
"acc_norm_stderr": 0.029923100563683906
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40758873929008566,
"mc1_stderr": 0.01720194923455311,
"mc2": 0.575115982039762,
"mc2_stderr": 0.015744867615337492
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.01163126836060778
},
"harness|gsm8k|5": {
"acc": 0.2721758908263836,
"acc_stderr": 0.012259714035164548
}
}
```
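Since the block above is plain JSON, per-task metrics can be sliced out directly. A minimal sketch, assuming the dictionary shown above has been saved to a local file (the filename `results.json` is hypothetical):

```python
import json

# Hypothetical local copy of the results dictionary printed above.
with open("results.json") as f:
    results = json.load(f)

# MMLU subtasks are keyed as "harness|hendrycksTest-<subject>|5".
mmlu_accs = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
}
best = max(mmlu_accs, key=mmlu_accs.get)
print(best, mmlu_accs[best])  # prints the best-scoring MMLU subtask
```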
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
DebasishDhal99/german-czech-paired-placenames | ---
license: mit
task_categories:
- translation
language:
- de
- cs
tags:
- history
size_categories:
- 1K<n<10K
---
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains the German and corresponding Czech names for almost 5k places in the Czech Republic. It has been generated using [this code](https://github.com/DebasishDhal99/Minor-Stuff/blob/main/paired-placenames-scrapping/german-czech.py).
Many of these names are related to each other: some German names are literal translations of the Czech names (or perhaps the other way around), some are phonetic adaptations, and some are unrelated.
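A minimal inspection sketch, assuming the table loads with the standard `datasets` API; since the card does not document the column schema, it just prints raw rows:

```python
from datasets import load_dataset

# Assumption: the dataset exposes a default "train" split.
ds = load_dataset("DebasishDhal99/german-czech-paired-placenames", split="train")
print(ds.column_names)  # the card does not document the schema
for row in ds.select(range(3)):
    print(row)  # one German/Czech placename pair per row
```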
## Dataset Creation
### Source Data
[English wiki page containing German exonyms for places in Czech Republic](https://en.wikipedia.org/wiki/List_of_German_names_for_places_in_the_Czech_Republic) |
Cartinoe5930/qwen_gsm8k | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: extracted_answer
dtype: int64
- name: Qwen1.5-1.8B_original_answer
dtype: string
- name: Qwen1.5-1.8B_processed_answer
dtype: int64
- name: Qwen1.5-1.8B_correctness
dtype: int64
- name: Qwen1.5-7B_original_answer
dtype: string
- name: Qwen1.5-7B_processed_answer
dtype: int64
- name: Qwen1.5-7B_correctness
dtype: int64
splits:
- name: train
num_bytes: 1688475
num_examples: 1319
download_size: 897383
dataset_size: 1688475
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
anan-2024/twitter_dataset_1713035805 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 25342
num_examples: 58
download_size: 13715
dataset_size: 25342
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
version-control/arrayblow-1.0-test | ---
dataset_info:
features:
- name: repo_name
dtype: string
- name: hexsha
dtype: string
- name: code
dtype: string
- name: file_path
dtype: string
- name: api_extract
dtype: string
splits:
- name: train
num_bytes: 2398323
num_examples: 213
download_size: 852094
dataset_size: 2398323
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Ngadou/social-engineering-convo | ---
license: apache-2.0
task_categories:
- text-classification
language:
- en
- fr
pretty_name: Social Engineering Conversation modelling
size_categories:
- n<1K
---
# Social Engineering Conversation modelling
## Rationale
LLMs are few-shot learners.
mideind/icelandic-inflection-easy | ---
license: cc-by-4.0
---
|
bstds/geonames | ---
dataset_info:
features:
- name: geonameid
dtype: int64
- name: name
dtype: string
- name: asciiname
dtype: string
- name: alternatenames
dtype: string
- name: latitude
dtype: float64
- name: longitude
dtype: float64
- name: feature_class
dtype: string
- name: feature_code
dtype: string
- name: country_code
dtype: string
- name: cc2
dtype: string
- name: admin1_code
dtype: string
- name: admin2_code
dtype: string
- name: admin3_code
dtype: string
- name: admin4_code
dtype: string
- name: population
dtype: int64
- name: elevation
dtype: float64
- name: dem
dtype: int64
- name: timezone
dtype: string
- name: modification_date
dtype: string
splits:
- name: train
num_bytes: 2378719253
num_examples: 12368001
download_size: 824343833
dataset_size: 2378719253
---
# Dataset Card for "geonames"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
[Source](https://download.geonames.org/export/dump/)
```
"geonameid", # integer id of record in geonames database
"name", # name of geographical point (utf8) varchar(200)
"asciiname", # name of geographical point in plain ascii characters, varchar(200)
"alternatenames",
# alternatenames, comma separated, ascii names automatically transliterated, convenience attribute from alternatename table, varchar(10000)
"latitude", # latitude in decimal degrees (wgs84)
"longitude", # longitude in decimal degrees (wgs84)
"feature_class", # see http://www.geonames.org/export/codes.html, char(1)
"feature_code", # see http://www.geonames.org/export/codes.html, varchar(10)
"country_code", # ISO-3166 2-letter country code, 2 characters
"cc2",
# alternate country codes, comma separated, ISO-3166 2-letter country code, 200 characters
"admin1_code",
# fipscode (subject to change to iso code), see exceptions below, see file admin1Codes.txt for display names of this code; varchar(20)
"admin2_code",
# code for the second administrative division, a county in the US, see file admin2Codes.txt; varchar(80)
"admin3_code", # code for third level administrative division, varchar(20)
"admin4_code", # code for fourth level administrative division, varchar(20)
"population", # bigint (8 byte int)
"elevation", # in meters, integer
"dem",
# digital elevation model, srtm3 or gtopo30, average elevation of 3''x3'' (ca 90mx90m) or 30''x30'' (ca 900mx900m) area in meters, integer. srtm processed by cgiar/ciat.
"timezone", # the iana timezone id (see file timeZone.txt) varchar(40)
"modification_date", # date of last modification in yyyy-MM-dd format"
``` |
open-llm-leaderboard/details_flemmingmiguel__Mistrality-7B | ---
pretty_name: Evaluation run of flemmingmiguel/Mistrality-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [flemmingmiguel/Mistrality-7B](https://huggingface.co/flemmingmiguel/Mistrality-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_flemmingmiguel__Mistrality-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-11T10:13:19.328780](https://huggingface.co/datasets/open-llm-leaderboard/details_flemmingmiguel__Mistrality-7B/blob/main/results_2024-01-11T10-13-19.328780.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6491508355371155,\n\
\ \"acc_stderr\": 0.031941087236083424,\n \"acc_norm\": 0.6501709046743914,\n\
\ \"acc_norm_stderr\": 0.03258488545513918,\n \"mc1\": 0.3929008567931457,\n\
\ \"mc1_stderr\": 0.017097248285233065,\n \"mc2\": 0.567957871171655,\n\
\ \"mc2_stderr\": 0.01545729191398638\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6279863481228669,\n \"acc_stderr\": 0.014124597881844461,\n\
\ \"acc_norm\": 0.6655290102389079,\n \"acc_norm_stderr\": 0.013787460322441374\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6734714200358495,\n\
\ \"acc_stderr\": 0.004679847503411342,\n \"acc_norm\": 0.858195578570006,\n\
\ \"acc_norm_stderr\": 0.00348136484077097\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.034765901043041336,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.034765901043041336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.036812296333943194,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.036812296333943194\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291932,\n\
\ \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291932\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624734,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624734\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n\
\ \"acc_stderr\": 0.013468201614066307,\n \"acc_norm\": 0.8288633461047255,\n\
\ \"acc_norm_stderr\": 0.013468201614066307\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3452513966480447,\n\
\ \"acc_stderr\": 0.015901432608930354,\n \"acc_norm\": 0.3452513966480447,\n\
\ \"acc_norm_stderr\": 0.015901432608930354\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.02440439492808787,\n\
\ \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.02440439492808787\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n\
\ \"acc_stderr\": 0.01274085387294983,\n \"acc_norm\": 0.4661016949152542,\n\
\ \"acc_norm_stderr\": 0.01274085387294983\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7242647058823529,\n \"acc_stderr\": 0.027146271936625166,\n\
\ \"acc_norm\": 0.7242647058823529,\n \"acc_norm_stderr\": 0.027146271936625166\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806304,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806304\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061463,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061463\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3929008567931457,\n\
\ \"mc1_stderr\": 0.017097248285233065,\n \"mc2\": 0.567957871171655,\n\
\ \"mc2_stderr\": 0.01545729191398638\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7932123125493291,\n \"acc_stderr\": 0.011382566829235798\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6671721000758151,\n \
\ \"acc_stderr\": 0.01297989249659828\n }\n}\n```"
repo_url: https://huggingface.co/flemmingmiguel/Mistrality-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|arc:challenge|25_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|gsm8k|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hellaswag|10_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T10-13-19.328780.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-11T10-13-19.328780.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- '**/details_harness|winogrande|5_2024-01-11T10-13-19.328780.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-11T10-13-19.328780.parquet'
- config_name: results
data_files:
- split: 2024_01_11T10_13_19.328780
path:
- results_2024-01-11T10-13-19.328780.parquet
- split: latest
path:
- results_2024-01-11T10-13-19.328780.parquet
---
# Dataset Card for Evaluation run of flemmingmiguel/Mistrality-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [flemmingmiguel/Mistrality-7B](https://huggingface.co/flemmingmiguel/Mistrality-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_flemmingmiguel__Mistrality-7B",
"harness_winogrande_5",
split="train")
```
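The aggregated metrics can be loaded the same way through the `results` configuration defined above, whose `latest` split always points to the most recent run:
```python
from datasets import load_dataset

# Load the aggregated scores for the most recent evaluation run
results = load_dataset("open-llm-leaderboard/details_flemmingmiguel__Mistrality-7B",
	"results",
	split="latest")
```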
## Latest results
These are the [latest results from run 2024-01-11T10:13:19.328780](https://huggingface.co/datasets/open-llm-leaderboard/details_flemmingmiguel__Mistrality-7B/blob/main/results_2024-01-11T10-13-19.328780.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6491508355371155,
"acc_stderr": 0.031941087236083424,
"acc_norm": 0.6501709046743914,
"acc_norm_stderr": 0.03258488545513918,
"mc1": 0.3929008567931457,
"mc1_stderr": 0.017097248285233065,
"mc2": 0.567957871171655,
"mc2_stderr": 0.01545729191398638
},
"harness|arc:challenge|25": {
"acc": 0.6279863481228669,
"acc_stderr": 0.014124597881844461,
"acc_norm": 0.6655290102389079,
"acc_norm_stderr": 0.013787460322441374
},
"harness|hellaswag|10": {
"acc": 0.6734714200358495,
"acc_stderr": 0.004679847503411342,
"acc_norm": 0.858195578570006,
"acc_norm_stderr": 0.00348136484077097
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.034765901043041336,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.034765901043041336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.036812296333943194,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.036812296333943194
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291932,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291932
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624734,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624734
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066307,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066307
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069367,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3452513966480447,
"acc_stderr": 0.015901432608930354,
"acc_norm": 0.3452513966480447,
"acc_norm_stderr": 0.015901432608930354
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.02440439492808787,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.02440439492808787
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4661016949152542,
"acc_stderr": 0.01274085387294983,
"acc_norm": 0.4661016949152542,
"acc_norm_stderr": 0.01274085387294983
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7242647058823529,
"acc_stderr": 0.027146271936625166,
"acc_norm": 0.7242647058823529,
"acc_norm_stderr": 0.027146271936625166
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806304,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806304
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061463,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061463
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3929008567931457,
"mc1_stderr": 0.017097248285233065,
"mc2": 0.567957871171655,
"mc2_stderr": 0.01545729191398638
},
"harness|winogrande|5": {
"acc": 0.7932123125493291,
"acc_stderr": 0.011382566829235798
},
"harness|gsm8k|5": {
"acc": 0.6671721000758151,
"acc_stderr": 0.01297989249659828
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
jqi/alpaca_nemo | ---
task_categories:
- text-generation
language:
- en
tags:
- SFT
size_categories:
- 10K<n<100K
---
Download Alpaca in NeMo SFT chat format:
```
git lfs install
git clone https://huggingface.co/datasets/jqi/alpaca_nemo
```
Then you can find the data file `alpaca_nemo/alpaca_nemo.jsonl`, which is about 32 MB in size.
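Since it is a JSONL file, each line holds one self-contained JSON record; a minimal sketch for peeking at the first record (the exact field names depend on the NeMo chat schema):
```python
import json

# Inspect the first record of the chat-formatted file
with open("alpaca_nemo/alpaca_nemo.jsonl", encoding="utf-8") as f:
    first = json.loads(f.readline())
print(list(first.keys()))
```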
To use it in NeMo, set the config:
```
data:
chat: True
chat_prompt_tokens:
system_turn_start: '<extra_id_0>'
turn_start: '<extra_id_1>'
label_start: '<extra_id_2>'
end_of_turn: "\x0A"
end_of_name: "\x0A"
train_ds:
file_names: [ 'alpaca_nemo/alpaca_nemo.jsonl' ]
``` |
hesamqqq111/WaterBodiesDataset | ---
license: apache-2.0
---
|
dotan1111/MSA-amino-7-seq | ---
tags:
- sequence-to-sequence
- bioinformatics
- biology
---
# Multiple Sequence Alignment as a Sequence-to-Sequence Learning Problem
## Abstract:
The sequence alignment problem is one of the most fundamental problems in bioinformatics and a plethora of methods were devised to tackle it. Here we introduce BetaAlign, a methodology for aligning sequences using an NLP approach. BetaAlign accounts for the possible variability of the evolutionary process among different datasets by using an ensemble of transformers, each trained on millions of samples generated from a different evolutionary model. Our approach leads to alignment accuracy that is similar and often better than commonly used methods, such as MAFFT, DIALIGN, ClustalW, T-Coffee, PRANK, and MUSCLE.

An illustration of aligning sequences with sequence-to-sequence learning. (a) Consider two input sequences "AAG" and "ACGG". (b) The result of encoding the unaligned sequences into the source language (*Concat* representation). (c) The sentence from the source language is translated to the target language via a transformer model. (d) The translated sentence in the target language (*Spaces* representation). (e) The resulting alignment, decoded from the translated sentence, in which "AA-G" is aligned to "ACGG". The transformer architecture illustration is adapted from (Vaswani et al., 2017).
## Data:
We used SpartaABC (Loewenthal et al., 2021) to generate millions of true alignments. SpartaABC requires the following input: (1) a rooted phylogenetic tree, which includes a topology and branch lengths; (2) a substitution model (amino acids or nucleotides); (3) root sequence length; (4) the indel model parameters, which include: insertion rate (*R_I*), deletion rate (*R_D*), a parameter for the insertion Zipfian distribution (*A_I*), and a parameter for the deletion Zipfian distribution (*A_D*). MSAs were simulated along random phylogenetic tree topologies generated using the program ETE version 3.0 (Huerta-Cepas et al., 2016) with default parameters.
We generated 1,495,000, 2,000, and 3,000 protein MSAs with ten sequences, which were used as training, validation, and testing data, respectively. We generated the same number of DNA MSAs. For each random tree, branch lengths were drawn from a uniform distribution in the range *(0.5,1.0)*. Next, the sequences were generated using SpartaABC with the following parameters: *R_I,R_D \in (0.0,0.05)*, *A_I, A_D \in (1.01,2.0)*. The alignment lengths as well as the sequence lengths of the tree leaves vary within and among datasets, as they depend on the indel dynamics and the root length. The root length was sampled uniformly in the range *[32,44]*. Unless stated otherwise, all protein datasets were generated with the WAG+G model, and all DNA datasets were generated with the GTR+G model, with the following parameters: (1) frequencies for the different nucleotides *(0.37, 0.166, 0.307, 0.158)*, in the order "T", "C", "A" and "G"; (2) the substitution rates *(0.444, 0.0843, 0.116, 0.107, 0.00027)*, in the order "a", "b", "c", "d", and "e" for the substitution matrix.
## Example:
The following example corresponds to the illustrated MSA in the figure above:
{"MSA": "AAAC-GGG", "unaligned_seqs": {"seq0": "AAG", "seq1": "ACGG"}}
## APA
```
Dotan, E., Belinkov, Y., Avram, O., Wygoda, E., Ecker, N., Alburquerque, M., Keren, O., Loewenthal, G., & Pupko T. (2023). Multiple sequence alignment as a sequence-to-sequence learning problem. The Eleventh International Conference on Learning Representations (ICLR 2023).
```
## BibTeX
```
@article{Dotan_multiple_2023,
author = {Dotan, Edo and Belinkov, Yonatan and Avram, Oren and Wygoda, Elya and Ecker, Noa and Alburquerque, Michael and Keren, Omri and Loewenthal, Gil and Pupko, Tal},
month = aug,
title = {{Multiple sequence alignment as a sequence-to-sequence learning problem}},
year = {2023}
}
``` |
harishvs/ecommerce-product-llama2-QA | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 38057
num_examples: 27
download_size: 11979
dataset_size: 38057
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
trina731/pixel_zhwikidump_prerendered | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: num_patches
dtype: int64
splits:
- name: train
num_bytes: 65428051.875
num_examples: 1881
download_size: 65443181
dataset_size: 65428051.875
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ibranze/araproje_hellaswag_en_w5 | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 150148.07692307694
num_examples: 250
download_size: 82789
dataset_size: 150148.07692307694
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_en_w5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zhihz0535/X-SVAMP_en_zh_ko_it_es | ---
license: mit
configs:
- config_name: default
data_files:
- split: english
path: english.json
- split: chinese
path: chinese.json
- split: korean
path: korean.json
- split: italian
path: italian.json
- split: spanish
path: spanish.json
task_categories:
- question-answering
- text-generation
language:
- en
- zh
- it
- ko
- es
size_categories:
- 1K<n<10K
---
# X-SVAMP
[**🤗 Paper**](https://huggingface.co/papers/2311.08711) | [**📖 arXiv**](https://arxiv.org/abs/2311.08711)
### Dataset Description
X-SVAMP is an evaluation benchmark for multilingual large language models (LLMs), including questions and answers in 5 languages (English, Chinese, Korean, Italian and Spanish).
It is intended to evaluate the math reasoning abilities of LLMs. The dataset was translated by GPT-4-turbo from the original English version of SVAMP.
In our paper, we evaluate LLMs in a zero-shot generative setting: prompt the instruction-tuned LLM with the question, collect its response (a chain-of-thought rationale),
and let GPT-3.5-turbo extract the predicted answer from the response. Then, we compare the extracted answer with the reference answer to calculate accuracy.
Each question is appended with a chain-of-thought prompt. In English, it is `Think step-by-step before reaching the final answer`. Feel free to change this prompt if needed.
- **Languages:** English, Chinese, Korean, Italian, Spanish
- **License:** MIT
## Dataset Structure
Each example is composed of 3 fields (a loading example is shown after the list):
- id: a numeric ID of the example. Examples in different languages with the same ID are translations of each other.
- instruction: The question for the language model. Each question is appended with a chain-of-thought prompt. Feel free to change this prompt if needed.
- answer: The reference answer to the question. SVAMP only includes non-negative integer answers.
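A single language split can be loaded directly with the `datasets` library, using the split names from the configuration above:
```python
from datasets import load_dataset

# Load the English split and inspect the first question
english = load_dataset("zhihz0535/X-SVAMP_en_zh_ko_it_es", split="english")
print(english[0]["instruction"], english[0]["answer"])
```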
## Citation [optional]
If you find the data useful, please kindly cite our paper:
```
@article{zhang2023plug,
title={PLUG: Leveraging Pivot Language in Cross-Lingual Instruction Tuning},
author={Zhang, Zhihan and Lee, Dong-Ho and Fang, Yuwei and Yu, Wenhao and Jia, Mengzhao and Jiang, Meng and Barbieri, Francesco},
journal={arXiv preprint arXiv:2311.08711},
year={2023}
}
``` |
krishnareddy/ner_samples | ---
license: mit
task_categories:
- token-classification
language:
- en
tags:
- medical
pretty_name: ner dataset format
size_categories:
- 10K<n<100K
---
This is a sample repository to upload data files related to the NER task.
sberhe/2023-3-software-release-notes | ---
license: cc
---
|
sanjay920/function_calling_v1 | ---
dataset_info:
features:
- name: conversations
dtype: string
- name: tools
dtype: string
- name: tools_raw
dtype: string
splits:
- name: train
num_bytes: 241296424
num_examples: 100886
download_size: 96583756
dataset_size: 241296424
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
louisbrulenaudet/code-fonction-publique | ---
license: apache-2.0
language:
- fr
multilinguality:
- monolingual
tags:
- finetuning
- legal
- french law
- droit français
- Code général de la fonction publique
source_datasets:
- original
pretty_name: Code général de la fonction publique
task_categories:
- text-generation
- table-question-answering
- summarization
- text-retrieval
- question-answering
- text-classification
size_categories:
- 1K<n<10K
---
# Code général de la fonction publique, non-instruct (2024-04-15)
This project focuses on fine-tuning pre-trained language models to create efficient and accurate models for legal practice.
Fine-tuning is the process of adapting a pre-trained model to perform specific tasks or cater to particular domains. It involves adjusting the model's parameters through a further round of training on task-specific or domain-specific data. While conventional fine-tuning strategies involve supervised learning with labeled data, instruction-based fine-tuning introduces a more structured and interpretable approach.
Instruction-based fine-tuning leverages the power of human-provided instructions to guide the model's behavior. These instructions can be in the form of text prompts, prompts with explicit task descriptions, or a combination of both. This approach allows for a more controlled and context-aware interaction with the LLM, making it adaptable to a multitude of specialized tasks.
Instruction-based fine-tuning significantly enhances the performance of LLMs in the following ways:
- Task-Specific Adaptation: LLMs, when fine-tuned with specific instructions, exhibit remarkable adaptability to diverse tasks. They can switch seamlessly between translation, summarization, and question-answering, guided by the provided instructions.
- Reduced Ambiguity: Traditional LLMs might generate ambiguous or contextually inappropriate responses. Instruction-based fine-tuning allows for a clearer and more context-aware generation, reducing the likelihood of nonsensical outputs.
- Efficient Knowledge Transfer: Instructions can encapsulate domain-specific knowledge, enabling LLMs to benefit from expert guidance. This knowledge transfer is particularly valuable in fields like tax practice, law, medicine, and more.
- Interpretability: Instruction-based fine-tuning also makes LLM behavior more interpretable. Since the instructions are human-readable, it becomes easier to understand and control model outputs.
- Adaptive Behavior: LLMs, post instruction-based fine-tuning, exhibit adaptive behavior that is responsive to both explicit task descriptions and implicit cues within the provided text.
## Concurrent reading of the LegalKit
To use all the legal data published on LegalKit, you can use this code snippet:
```python
# -*- coding: utf-8 -*-
import concurrent.futures
import logging

import datasets
from tqdm.notebook import tqdm
def dataset_loader(
name:str,
streaming:bool=True
) -> datasets.Dataset:
"""
Helper function to load a single dataset in parallel.
Parameters
----------
name : str
Name of the dataset to be loaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
dataset : datasets.Dataset
Loaded dataset object.
Raises
------
Exception
If an error occurs during dataset loading.
"""
try:
return datasets.load_dataset(
name,
split="train",
streaming=streaming
)
except Exception as exc:
logging.error(f"Error loading dataset {name}: {exc}")
return None
def load_datasets(
req:list,
streaming:bool=True
) -> list:
"""
Downloads datasets specified in a list and creates a list of loaded datasets.
Parameters
----------
req : list
A list containing the names of datasets to be downloaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
datasets_list : list
A list containing loaded datasets as per the requested names provided in 'req'.
Raises
------
Exception
If an error occurs during dataset loading or processing.
Examples
--------
    >>> datasets_list = load_datasets(["dataset1", "dataset2"], streaming=False)
"""
datasets_list = []
with concurrent.futures.ThreadPoolExecutor() as executor:
        future_to_dataset = {executor.submit(dataset_loader, name, streaming): name for name in req}
for future in tqdm(concurrent.futures.as_completed(future_to_dataset), total=len(req)):
name = future_to_dataset[future]
try:
dataset = future.result()
if dataset:
datasets_list.append(dataset)
except Exception as exc:
logging.error(f"Error processing dataset {name}: {exc}")
return datasets_list
req = [
"louisbrulenaudet/code-artisanat",
"louisbrulenaudet/code-action-sociale-familles",
# ...
]
datasets_list = load_datasets(
req=req,
streaming=True
)
dataset = datasets.concatenate_datasets(
datasets_list
)
```
## Dataset generation
This JSON file is a list of dictionaries; each dictionary contains the following fields (a loading sketch follows the list):
- `instruction`: `string`, presenting the instruction linked to the element.
- `input`: `string`, signifying the input details for the element.
- `output`: `string`, indicating the output information for the element.
- `start`: `string`, the date of entry into force of the article.
- `expiration`: `string`, the date of expiration of the article.
- `num`: `string`, the id of the article.
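For example, a record's fields can be inspected after streaming the dataset:
```python
from datasets import load_dataset

# Stream the dataset and inspect the fields of the first record
dataset = load_dataset(
    "louisbrulenaudet/code-fonction-publique",
    split="train",
    streaming=True
)
first = next(iter(dataset))
print({key: first[key] for key in ("instruction", "input", "output", "start", "expiration", "num")})
```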
We used the following list of instructions for generating the dataset:
```python
instructions = [
"Compose l'intégralité de l'article sous forme écrite.",
"Écris la totalité du contenu de l'article.",
"Formule la totalité du texte présent dans l'article.",
"Produis l'intégralité de l'article en écriture.",
"Développe l'article dans son ensemble par écrit.",
"Génère l'ensemble du texte contenu dans l'article.",
"Formule le contenu intégral de l'article en entier.",
"Rédige la totalité du texte de l'article en entier.",
"Compose l'intégralité du contenu textuel de l'article.",
"Rédige l'ensemble du texte qui constitue l'article.",
"Formule l'article entier dans son contenu écrit.",
"Composez l'intégralité de l'article sous forme écrite.",
"Écrivez la totalité du contenu de l'article.",
"Formulez la totalité du texte présent dans l'article.",
"Développez l'article dans son ensemble par écrit.",
"Générez l'ensemble du texte contenu dans l'article.",
"Formulez le contenu intégral de l'article en entier.",
"Rédigez la totalité du texte de l'article en entier.",
"Composez l'intégralité du contenu textuel de l'article.",
"Écrivez l'article dans son intégralité en termes de texte.",
"Rédigez l'ensemble du texte qui constitue l'article.",
"Formulez l'article entier dans son contenu écrit.",
"Composer l'intégralité de l'article sous forme écrite.",
"Écrire la totalité du contenu de l'article.",
"Formuler la totalité du texte présent dans l'article.",
"Produire l'intégralité de l'article en écriture.",
"Développer l'article dans son ensemble par écrit.",
"Générer l'ensemble du texte contenu dans l'article.",
"Formuler le contenu intégral de l'article en entier.",
"Rédiger la totalité du texte de l'article en entier.",
"Composer l'intégralité du contenu textuel de l'article.",
"Rédiger l'ensemble du texte qui constitue l'article.",
"Formuler l'article entier dans son contenu écrit.",
"Quelles sont les dispositions de l'article ?",
"Quelles dispositions sont incluses dans l'article ?",
"Quelles sont les dispositions énoncées dans l'article ?",
"Quel est le texte intégral de l'article ?",
"Quelle est la lettre de l'article ?"
]
```
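As an illustration of how these instructions pair with articles, one record can be assembled by sampling from the list above (the `input` and `output` values below are hypothetical placeholders, not actual records from the dataset):
```python
import random

# Build one illustrative training record; `instructions` is the list defined above
record = {
    "instruction": random.choice(instructions),
    "input": "Code général de la fonction publique, art. L. 1",  # hypothetical article reference
    "output": "Texte intégral de l'article...",  # hypothetical article body
}
print(record["instruction"])
```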
## Feedback
If you have any feedback, please reach out at [louisbrulenaudet@icloud.com](mailto:louisbrulenaudet@icloud.com). |
vzach/bunny_test | ---
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 10950
num_examples: 10
download_size: 10170
dataset_size: 10950
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_nlpguy__T3QM7XP | ---
pretty_name: Evaluation run of nlpguy/T3QM7XP
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nlpguy/T3QM7XP](https://huggingface.co/nlpguy/T3QM7XP) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nlpguy__T3QM7XP\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-22T19:02:25.872345](https://huggingface.co/datasets/open-llm-leaderboard/details_nlpguy__T3QM7XP/blob/main/results_2024-03-22T19-02-25.872345.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6511911984965604,\n\
\ \"acc_stderr\": 0.03207056500750385,\n \"acc_norm\": 0.650135190413139,\n\
\ \"acc_norm_stderr\": 0.032747068948941103,\n \"mc1\": 0.6303549571603427,\n\
\ \"mc1_stderr\": 0.016898180706973878,\n \"mc2\": 0.7805646795719938,\n\
\ \"mc2_stderr\": 0.013687445864948983\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.712457337883959,\n \"acc_stderr\": 0.013226719056266127,\n\
\ \"acc_norm\": 0.7303754266211604,\n \"acc_norm_stderr\": 0.012968040686869148\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7178848834893448,\n\
\ \"acc_stderr\": 0.004491093528113408,\n \"acc_norm\": 0.8911571400119498,\n\
\ \"acc_norm_stderr\": 0.003108054563352107\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.032400380867927465,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.032400380867927465\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778398,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778398\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.0303883535518868,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.0303883535518868\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659806,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659806\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834841,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834841\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4245810055865922,\n\
\ \"acc_stderr\": 0.016531170993278888,\n \"acc_norm\": 0.4245810055865922,\n\
\ \"acc_norm_stderr\": 0.016531170993278888\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4745762711864407,\n\
\ \"acc_stderr\": 0.012753716929101006,\n \"acc_norm\": 0.4745762711864407,\n\
\ \"acc_norm_stderr\": 0.012753716929101006\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6303549571603427,\n\
\ \"mc1_stderr\": 0.016898180706973878,\n \"mc2\": 0.7805646795719938,\n\
\ \"mc2_stderr\": 0.013687445864948983\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8500394632991318,\n \"acc_stderr\": 0.010034394804580809\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7058377558756633,\n \
\ \"acc_stderr\": 0.012551285331470157\n }\n}\n```"
repo_url: https://huggingface.co/nlpguy/T3QM7XP
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|arc:challenge|25_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|gsm8k|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hellaswag|10_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T19-02-25.872345.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T19-02-25.872345.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- '**/details_harness|winogrande|5_2024-03-22T19-02-25.872345.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-22T19-02-25.872345.parquet'
- config_name: results
data_files:
- split: 2024_03_22T19_02_25.872345
path:
- results_2024-03-22T19-02-25.872345.parquet
- split: latest
path:
- results_2024-03-22T19-02-25.872345.parquet
---
# Dataset Card for Evaluation run of nlpguy/T3QM7XP
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nlpguy/T3QM7XP](https://huggingface.co/nlpguy/T3QM7XP) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Load one per-task config (here Winogrande, 5-shot); "train" tracks the latest run.
data = load_dataset("open-llm-leaderboard/details_nlpguy__T3QM7XP",
	"harness_winogrande_5",
	split="train")
```
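Each per-task config also exposes a `latest` split pointing at the newest run, so you can pin that explicitly instead of relying on "train" (a minimal sketch, using the split names listed in the configs above):
```python
from datasets import load_dataset

# "latest" always resolves to the most recent evaluation run for this task.
data = load_dataset("open-llm-leaderboard/details_nlpguy__T3QM7XP",
	"harness_winogrande_5",
	split="latest")
```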
## Latest results
These are the [latest results from run 2024-03-22T19:02:25.872345](https://huggingface.co/datasets/open-llm-leaderboard/details_nlpguy__T3QM7XP/blob/main/results_2024-03-22T19-02-25.872345.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6511911984965604,
"acc_stderr": 0.03207056500750385,
"acc_norm": 0.650135190413139,
"acc_norm_stderr": 0.032747068948941103,
"mc1": 0.6303549571603427,
"mc1_stderr": 0.016898180706973878,
"mc2": 0.7805646795719938,
"mc2_stderr": 0.013687445864948983
},
"harness|arc:challenge|25": {
"acc": 0.712457337883959,
"acc_stderr": 0.013226719056266127,
"acc_norm": 0.7303754266211604,
"acc_norm_stderr": 0.012968040686869148
},
"harness|hellaswag|10": {
"acc": 0.7178848834893448,
"acc_stderr": 0.004491093528113408,
"acc_norm": 0.8911571400119498,
"acc_norm_stderr": 0.003108054563352107
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778398,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.0303883535518868,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.0303883535518868
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659806,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834841,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834841
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4245810055865922,
"acc_stderr": 0.016531170993278888,
"acc_norm": 0.4245810055865922,
"acc_norm_stderr": 0.016531170993278888
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4745762711864407,
"acc_stderr": 0.012753716929101006,
"acc_norm": 0.4745762711864407,
"acc_norm_stderr": 0.012753716929101006
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6303549571603427,
"mc1_stderr": 0.016898180706973878,
"mc2": 0.7805646795719938,
"mc2_stderr": 0.013687445864948983
},
"harness|winogrande|5": {
"acc": 0.8500394632991318,
"acc_stderr": 0.010034394804580809
},
"harness|gsm8k|5": {
"acc": 0.7058377558756633,
"acc_stderr": 0.012551285331470157
}
}
```
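To work with these aggregated numbers programmatically rather than copying them from the JSON above, you can load the `results` config (a minimal sketch, assuming the `results` config and `latest` split declared in the YAML header):
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics shown above.
results = load_dataset("open-llm-leaderboard/details_nlpguy__T3QM7XP",
	"results",
	split="latest")
print(results[0])
```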
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ml6team/the-stack-smol-python | ---
dataset_info:
features:
- name: content
dtype: string
- name: avg_line_length
dtype: float64
- name: max_line_length
dtype: int64
- name: alphanum_fraction
dtype: float64
- name: licenses
sequence: string
- name: repository_name
dtype: string
- name: path
dtype: string
- name: size
dtype: int64
- name: lang
dtype: string
splits:
- name: train
num_bytes: 82161631
num_examples: 10000
download_size: 28757440
dataset_size: 82161631
---
# Dataset Card for "the-stack-smol-python"
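A minimal loading sketch, assuming the `train` split and the features declared in the YAML header above:
```python
from datasets import load_dataset

# 10,000 Python files with per-file metadata (path, size, licenses, ...).
ds = load_dataset("ml6team/the-stack-smol-python", split="train")
example = ds[0]
print(example["path"], example["size"], example["lang"])
```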
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e3 | ---
pretty_name: Evaluation run of silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e3](https://huggingface.co/silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-22T17:17:10.255551](https://huggingface.co/datasets/open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e3/blob/main/results_2024-01-22T17-17-10.255551.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6076374032172701,\n\
\ \"acc_stderr\": 0.0331629731019256,\n \"acc_norm\": 0.6120606501518099,\n\
\ \"acc_norm_stderr\": 0.03383578080966383,\n \"mc1\": 0.5581395348837209,\n\
\ \"mc1_stderr\": 0.01738476747898621,\n \"mc2\": 0.7059182813774988,\n\
\ \"mc2_stderr\": 0.01504259695078292\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5878839590443686,\n \"acc_stderr\": 0.014383915302225407,\n\
\ \"acc_norm\": 0.6262798634812287,\n \"acc_norm_stderr\": 0.014137708601759084\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6750647281418044,\n\
\ \"acc_stderr\": 0.004673934837150448,\n \"acc_norm\": 0.8531169089822744,\n\
\ \"acc_norm_stderr\": 0.0035326587973575525\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.02872750295788027,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.02872750295788027\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n\
\ \"acc_stderr\": 0.037507570448955356,\n \"acc_norm\": 0.5895953757225434,\n\
\ \"acc_norm_stderr\": 0.037507570448955356\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726367,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726367\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3783068783068783,\n \"acc_stderr\": 0.02497695405315525,\n \"\
acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.02497695405315525\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5935483870967742,\n \"acc_stderr\": 0.02794172734625631,\n \"\
acc_norm\": 0.5935483870967742,\n \"acc_norm_stderr\": 0.02794172734625631\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"\
acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124484,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124484\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.02541634309630644,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.02541634309630644\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5769230769230769,\n \"acc_stderr\": 0.025049197876042345,\n\
\ \"acc_norm\": 0.5769230769230769,\n \"acc_norm_stderr\": 0.025049197876042345\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608456,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608456\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7944954128440367,\n \"acc_stderr\": 0.017324352325016012,\n \"\
acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.017324352325016012\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.02933116229425174,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02933116229425174\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\
\ \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.6233183856502242,\n\
\ \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847836,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847836\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690879,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690879\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597552,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597552\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7816091954022989,\n\
\ \"acc_stderr\": 0.014774358319934488,\n \"acc_norm\": 0.7816091954022989,\n\
\ \"acc_norm_stderr\": 0.014774358319934488\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n\
\ \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30614525139664805,\n\
\ \"acc_stderr\": 0.015414494487903227,\n \"acc_norm\": 0.30614525139664805,\n\
\ \"acc_norm_stderr\": 0.015414494487903227\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.02671611838015685,\n\
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.02671611838015685\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.02616058445014045,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.02616058445014045\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.025557653981868045,\n\
\ \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.025557653981868045\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4380704041720991,\n\
\ \"acc_stderr\": 0.012671902782567657,\n \"acc_norm\": 0.4380704041720991,\n\
\ \"acc_norm_stderr\": 0.012671902782567657\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.029349803139765873,\n\
\ \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.029349803139765873\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.630718954248366,\n \"acc_stderr\": 0.019524316744866353,\n \
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.019524316744866353\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6965174129353234,\n\
\ \"acc_stderr\": 0.032510068164586174,\n \"acc_norm\": 0.6965174129353234,\n\
\ \"acc_norm_stderr\": 0.032510068164586174\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"\
acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\"\
: 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\":\
\ {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5581395348837209,\n\
\ \"mc1_stderr\": 0.01738476747898621,\n \"mc2\": 0.7059182813774988,\n\
\ \"mc2_stderr\": 0.01504259695078292\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7734806629834254,\n \"acc_stderr\": 0.011764149054698338\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.39727065959059893,\n \
\ \"acc_stderr\": 0.013478659652337787\n }\n}\n```"
repo_url: https://huggingface.co/silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|arc:challenge|25_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|gsm8k|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hellaswag|10_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T17-17-10.255551.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-22T17-17-10.255551.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- '**/details_harness|winogrande|5_2024-01-22T17-17-10.255551.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-22T17-17-10.255551.parquet'
- config_name: results
data_files:
- split: 2024_01_22T17_17_10.255551
path:
- results_2024-01-22T17-17-10.255551.parquet
- split: latest
path:
- results_2024-01-22T17-17-10.255551.parquet
---
# Dataset Card for Evaluation run of silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e3](https://huggingface.co/silvercoder45/Mistral-7b-instruct-v0.2-summ-dpo-e3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e3",
"harness_winogrande_5",
split="train")
```
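Since each evaluated task is exposed as a separate configuration, you can also enumerate the available configs programmatically before picking one to load. A minimal sketch using the `datasets` helper (repository id as above):
```python
from datasets import get_dataset_config_names

# Each evaluated task is exposed as its own configuration;
# list them all before choosing one to load.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e3"
)
print(configs)  # e.g. ['harness_arc_challenge_25', 'harness_gsm8k_5', ...]
```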
## Latest results
These are the [latest results from run 2024-01-22T17:17:10.255551](https://huggingface.co/datasets/open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e3/blob/main/results_2024-01-22T17-17-10.255551.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6076374032172701,
"acc_stderr": 0.0331629731019256,
"acc_norm": 0.6120606501518099,
"acc_norm_stderr": 0.03383578080966383,
"mc1": 0.5581395348837209,
"mc1_stderr": 0.01738476747898621,
"mc2": 0.7059182813774988,
"mc2_stderr": 0.01504259695078292
},
"harness|arc:challenge|25": {
"acc": 0.5878839590443686,
"acc_stderr": 0.014383915302225407,
"acc_norm": 0.6262798634812287,
"acc_norm_stderr": 0.014137708601759084
},
"harness|hellaswag|10": {
"acc": 0.6750647281418044,
"acc_stderr": 0.004673934837150448,
"acc_norm": 0.8531169089822744,
"acc_norm_stderr": 0.0035326587973575525
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.02872750295788027,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.02872750295788027
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.037507570448955356,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.037507570448955356
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726367,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726367
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.02497695405315525,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.02497695405315525
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5935483870967742,
"acc_stderr": 0.02794172734625631,
"acc_norm": 0.5935483870967742,
"acc_norm_stderr": 0.02794172734625631
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124484,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124484
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.02541634309630644,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.02541634309630644
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5769230769230769,
"acc_stderr": 0.025049197876042345,
"acc_norm": 0.5769230769230769,
"acc_norm_stderr": 0.025049197876042345
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608456,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608456
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.017324352325016012,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.017324352325016012
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02933116229425174,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02933116229425174
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690879,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690879
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597552,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597552
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7816091954022989,
"acc_stderr": 0.014774358319934488,
"acc_norm": 0.7816091954022989,
"acc_norm_stderr": 0.014774358319934488
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30614525139664805,
"acc_stderr": 0.015414494487903227,
"acc_norm": 0.30614525139664805,
"acc_norm_stderr": 0.015414494487903227
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.02671611838015685,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.02671611838015685
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.02616058445014045,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.02616058445014045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.025557653981868045,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.025557653981868045
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4380704041720991,
"acc_stderr": 0.012671902782567657,
"acc_norm": 0.4380704041720991,
"acc_norm_stderr": 0.012671902782567657
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.029349803139765873,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.029349803139765873
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.019524316744866353,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.019524316744866353
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6965174129353234,
"acc_stderr": 0.032510068164586174,
"acc_norm": 0.6965174129353234,
"acc_norm_stderr": 0.032510068164586174
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5581395348837209,
"mc1_stderr": 0.01738476747898621,
"mc2": 0.7059182813774988,
"mc2_stderr": 0.01504259695078292
},
"harness|winogrande|5": {
"acc": 0.7734806629834254,
"acc_stderr": 0.011764149054698338
},
"harness|gsm8k|5": {
"acc": 0.39727065959059893,
"acc_stderr": 0.013478659652337787
}
}
```
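The aggregated metrics above are also stored in the "results" configuration, whose "latest" split points to the most recent run (see the configs listed in the frontmatter). A short loading sketch, using the same repository id as above:
```python
from datasets import load_dataset

# The "results" config aggregates the metrics of every run;
# its "latest" split points to the most recent one.
results = load_dataset(
    "open-llm-leaderboard/details_silvercoder45__Mistral-7b-instruct-v0.2-summ-dpo-e3",
    "results",
    split="latest",
)
print(results[0])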
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
hxue3/autotrain-data-code_summarization | ---
language:
- en
task_categories:
- conditional-text-generation
---
# AutoTrain Dataset for project: code_summarization
## Dataset Description
This dataset has been automatically processed by AutoTrain for project code_summarization.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "def read(self, table, columns, keyset, index=\"\", limit=0, partition=None):\n \"\"\"Perform a ``St[...]",
"target": "Perform a ``StreamingRead`` API request for rows in a table.\n\n :type table: str\n :para[...]"
},
{
"text": "def maf_somatic_variant_stats(variant, variant_metadata):\n \"\"\"\n Parse out the variant calling [...]",
"target": "Parse out the variant calling statistics for a given variant from a MAF file\n\n Assumes the MAF fo[...]"
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "Value(dtype='string', id=None)"
}
```
### Dataset Splits
This dataset is split into train and validation splits. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 800 |
| valid | 200 |
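To inspect the data, a minimal loading sketch (the repository id is taken from this card; it is assumed the data files load under the default configuration):
```python
from datasets import load_dataset

# Repository id taken from this card; splits are "train" (800) and "valid" (200)
dataset = load_dataset("hxue3/autotrain-data-code_summarization")

sample = dataset["train"][0]
print(sample["text"])    # source code to be summarized
print(sample["target"])  # reference summary
```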
|
Deojoandco/capstone_fromgpt_without_gold_v10_all | ---
dataset_info:
features:
- name: dialog_id
dtype: int64
- name: dialogue
dtype: string
- name: summary
dtype: string
- name: gold_tags
dtype: string
- name: gpt_success
dtype: bool
- name: gpt_response
dtype: string
- name: gold_tags_tokens_count
dtype: int64
- name: GPT_TAGS_FOUND
dtype: bool
- name: gpt_output_tags
dtype: string
- name: gpt_output_tag_tokens_count
dtype: int64
- name: GPT_MI_FOUND
dtype: bool
- name: gpt_tags_token_count
dtype: int64
- name: gpt_tags
dtype: string
- name: tag_token_count_match
dtype: bool
splits:
- name: train
num_bytes: 124032
num_examples: 76
- name: validation
num_bytes: 23025
num_examples: 12
- name: test
num_bytes: 14556
num_examples: 12
download_size: 82408
dataset_size: 161613
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
# Dataset Card for "capstone_fromgpt_without_gold_v10_all"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rolisz/roekk_13_0_ro_pseudo_labelled | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: id
dtype: string
- name: whisper_transcript
dtype: string
splits:
- name: train
num_bytes: 427119628.856
num_examples: 3192
download_size: 426665532
dataset_size: 427119628.856
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
lshowway/reorder.sov.es | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1091423892
num_examples: 566216
download_size: 514178300
dataset_size: 1091423892
---
# Dataset Card for "reorder.sov.es"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jamesagilesoda/ko-corpus-cleaned-12653878 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: clean
num_bytes: 100325969043
num_examples: 12653878
- name: noisy
num_bytes: 144185494007
num_examples: 19713534
download_size: 139825706722
dataset_size: 244511463050
configs:
- config_name: default
data_files:
- split: clean
path: data/clean-*
- split: noisy
path: data/noisy-*
---
|
Norod78/muppetshow-blip-captions | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 495380043.0
num_examples: 402
download_size: 495385822
dataset_size: 495380043.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "muppetshow-blip-captions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
existence-master/bloomify-classification-external | ---
license: gpl-3.0
---
|
open-llm-leaderboard/details_andysalerno__openchat-nectar-0.14 | ---
pretty_name: Evaluation run of andysalerno/openchat-nectar-0.14
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [andysalerno/openchat-nectar-0.14](https://huggingface.co/andysalerno/openchat-nectar-0.14)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_andysalerno__openchat-nectar-0.14\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-23T23:09:38.113022](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__openchat-nectar-0.14/blob/main/results_2024-01-23T23-09-38.113022.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6488818534853035,\n\
\ \"acc_stderr\": 0.03208696726323281,\n \"acc_norm\": 0.6491291937563749,\n\
\ \"acc_norm_stderr\": 0.03275329412199132,\n \"mc1\": 0.3317013463892289,\n\
\ \"mc1_stderr\": 0.016482148810241477,\n \"mc2\": 0.500920468272256,\n\
\ \"mc2_stderr\": 0.015353134413860351\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.621160409556314,\n \"acc_stderr\": 0.014175915490000324,\n\
\ \"acc_norm\": 0.6561433447098977,\n \"acc_norm_stderr\": 0.013880644570156218\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6361282613025294,\n\
\ \"acc_stderr\": 0.004801290954387085,\n \"acc_norm\": 0.8302131049591714,\n\
\ \"acc_norm_stderr\": 0.0037467817125096527\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6814814814814815,\n\
\ \"acc_stderr\": 0.04024778401977109,\n \"acc_norm\": 0.6814814814814815,\n\
\ \"acc_norm_stderr\": 0.04024778401977109\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n\
\ \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04082482904638629,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04082482904638629\n },\n\
\ \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n\
\ \"acc_stderr\": 0.02540255550326091,\n \"acc_norm\": 0.41798941798941797,\n\
\ \"acc_norm_stderr\": 0.02540255550326091\n },\n \"harness|hendrycksTest-formal_logic|5\"\
: {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n\
\ \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n\
\ },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.02302589961718872,\n\
\ \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.02302589961718872\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"\
acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.02247325333276875,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.02247325333276875\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616258,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616258\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.02615686752393104,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.02615686752393104\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8227848101265823,\n \"acc_stderr\": 0.024856364184503228,\n \
\ \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.024856364184503228\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917671,\n \"\
acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917671\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406978,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406978\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608308,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608308\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n\
\ \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.20670391061452514,\n\
\ \"acc_stderr\": 0.013543260867834455,\n \"acc_norm\": 0.20670391061452514,\n\
\ \"acc_norm_stderr\": 0.013543260867834455\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875195,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875195\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.02447722285613511,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.02447722285613511\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4817470664928292,\n\
\ \"acc_stderr\": 0.012761723960595472,\n \"acc_norm\": 0.4817470664928292,\n\
\ \"acc_norm_stderr\": 0.012761723960595472\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7169117647058824,\n \"acc_stderr\": 0.027365861131513812,\n\
\ \"acc_norm\": 0.7169117647058824,\n \"acc_norm_stderr\": 0.027365861131513812\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399677,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.02619392354445412,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.02619392354445412\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.03061111655743253,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.03061111655743253\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3317013463892289,\n\
\ \"mc1_stderr\": 0.016482148810241477,\n \"mc2\": 0.500920468272256,\n\
\ \"mc2_stderr\": 0.015353134413860351\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8200473559589582,\n \"acc_stderr\": 0.01079646868806868\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6921910538286581,\n \
\ \"acc_stderr\": 0.012714401009923644\n }\n}\n```"
repo_url: https://huggingface.co/andysalerno/openchat-nectar-0.14
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|arc:challenge|25_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|gsm8k|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hellaswag|10_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T23-09-38.113022.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T23-09-38.113022.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- '**/details_harness|winogrande|5_2024-01-23T23-09-38.113022.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-23T23-09-38.113022.parquet'
- config_name: results
data_files:
- split: 2024_01_23T23_09_38.113022
path:
- results_2024-01-23T23-09-38.113022.parquet
- split: latest
path:
- results_2024-01-23T23-09-38.113022.parquet
---
# Dataset Card for Evaluation run of andysalerno/openchat-nectar-0.14
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [andysalerno/openchat-nectar-0.14](https://huggingface.co/andysalerno/openchat-nectar-0.14) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_andysalerno__openchat-nectar-0.14",
"harness_winogrande_5",
split="train")
```
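The aggregated results live in the "results" configuration listed above; here is a minimal sketch of loading them (the config and split names are taken from the `configs` section of this card):
```python
from datasets import load_dataset
# Load the aggregated results (the "results" configuration, "latest" split)
results = load_dataset("open-llm-leaderboard/details_andysalerno__openchat-nectar-0.14",
	"results",
	split="latest")
```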
## Latest results
These are the [latest results from run 2024-01-23T23:09:38.113022](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__openchat-nectar-0.14/blob/main/results_2024-01-23T23-09-38.113022.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6488818534853035,
"acc_stderr": 0.03208696726323281,
"acc_norm": 0.6491291937563749,
"acc_norm_stderr": 0.03275329412199132,
"mc1": 0.3317013463892289,
"mc1_stderr": 0.016482148810241477,
"mc2": 0.500920468272256,
"mc2_stderr": 0.015353134413860351
},
"harness|arc:challenge|25": {
"acc": 0.621160409556314,
"acc_stderr": 0.014175915490000324,
"acc_norm": 0.6561433447098977,
"acc_norm_stderr": 0.013880644570156218
},
"harness|hellaswag|10": {
"acc": 0.6361282613025294,
"acc_stderr": 0.004801290954387085,
"acc_norm": 0.8302131049591714,
"acc_norm_stderr": 0.0037467817125096527
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6814814814814815,
"acc_stderr": 0.04024778401977109,
"acc_norm": 0.6814814814814815,
"acc_norm_stderr": 0.04024778401977109
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.04082482904638629,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04082482904638629
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.02302589961718872,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.02302589961718872
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.02247325333276875,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.02247325333276875
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616258,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616258
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02615686752393104,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02615686752393104
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.024856364184503228,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.024856364184503228
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917671,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406978,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406978
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608308,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608308
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.20670391061452514,
"acc_stderr": 0.013543260867834455,
"acc_norm": 0.20670391061452514,
"acc_norm_stderr": 0.013543260867834455
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.02592237178881877,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.02592237178881877
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4817470664928292,
"acc_stderr": 0.012761723960595472,
"acc_norm": 0.4817470664928292,
"acc_norm_stderr": 0.012761723960595472
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7169117647058824,
"acc_stderr": 0.027365861131513812,
"acc_norm": 0.7169117647058824,
"acc_norm_stderr": 0.027365861131513812
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.018824219512706207,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.018824219512706207
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399677,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.02619392354445412,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.02619392354445412
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.03061111655743253,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.03061111655743253
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3317013463892289,
"mc1_stderr": 0.016482148810241477,
"mc2": 0.500920468272256,
"mc2_stderr": 0.015353134413860351
},
"harness|winogrande|5": {
"acc": 0.8200473559589582,
"acc_stderr": 0.01079646868806868
},
"harness|gsm8k|5": {
"acc": 0.6921910538286581,
"acc_stderr": 0.012714401009923644
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Cohere/miracl-ru-corpus-22-12 | ---
annotations_creators:
- expert-generated
language:
- ru
multilinguality:
- multilingual
size_categories: []
source_datasets: []
tags: []
task_categories:
- text-retrieval
license:
- apache-2.0
task_ids:
- document-retrieval
---
# MIRACL (ru) embedded with cohere.ai `multilingual-22-12` encoder
We encoded the [MIRACL dataset](https://huggingface.co/miracl) using the [cohere.ai](https://txt.cohere.ai/multilingual/) `multilingual-22-12` embedding model.
The query embeddings can be found in [Cohere/miracl-ru-queries-22-12](https://huggingface.co/datasets/Cohere/miracl-ru-queries-22-12) and the corpus embeddings can be found in [Cohere/miracl-ru-corpus-22-12](https://huggingface.co/datasets/Cohere/miracl-ru-corpus-22-12).
For the original datasets, see [miracl/miracl](https://huggingface.co/datasets/miracl/miracl) and [miracl/miracl-corpus](https://huggingface.co/datasets/miracl/miracl-corpus).
Dataset info:
> MIRACL 🌍🙌🌏 (Multilingual Information Retrieval Across a Continuum of Languages) is a multilingual retrieval dataset that focuses on search across 18 different languages, which collectively encompass over three billion native speakers around the world.
>
> The corpus for each language is prepared from a Wikipedia dump, where we keep only the plain text and discard images, tables, etc. Each article is segmented into multiple passages using WikiExtractor based on natural discourse units (e.g., `\n\n` in the wiki markup). Each of these passages comprises a "document" or unit of retrieval. We preserve the Wikipedia article title of each passage.
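To make the unit of retrieval concrete, here is a minimal, illustrative sketch of that segmentation idea (a simplification of the actual WikiExtractor pipeline; the helper name is ours):
```python
# Split an article's plain text on blank lines ("\n\n") and keep the
# article title alongside every resulting passage.
def segment_article(title: str, plain_text: str):
    passages = [p.strip() for p in plain_text.split("\n\n") if p.strip()]
    return [{"title": title, "text": p} for p in passages]

docs = segment_article("Москва", "Москва — столица России.\n\nГород основан в 1147 году.")
# -> two passages, each a separate unit of retrieval
```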
## Embeddings
We computed the embeddings for `title+" "+text` using our `multilingual-22-12` embedding model, a state-of-the-art model for semantic search in 100 languages. If you want to learn more about this model, have a look at [cohere.ai multilingual embedding model](https://txt.cohere.ai/multilingual/).
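As an illustration, a single corpus entry could be embedded like this (a sketch only, assuming `api_key` is defined; see the API example further below):
```python
import cohere

co = cohere.Client(api_key)  # assumption: your Cohere API key
doc = {"title": "Москва", "text": "Москва — столица России."}
response = co.embed(texts=[doc["title"] + " " + doc["text"]], model="multilingual-22-12")
doc_embedding = response.embeddings[0]  # this is what the `emb` column stores
```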
## Loading the dataset
In [miracl-ru-corpus-22-12](https://huggingface.co/datasets/Cohere/miracl-ru-corpus-22-12) we provide the corpus embeddings. Note that, depending on the selected split, the respective files can be quite large.
You can either load the dataset like this:
```python
from datasets import load_dataset
docs = load_dataset("Cohere/miracl-ru-corpus-22-12", split="train")
```
Alternatively, you can stream it without downloading it first:
```python
from datasets import load_dataset
docs = load_dataset("Cohere/miracl-ru-corpus-22-12", split="train", streaming=True)
for doc in docs:
docid = doc['docid']
title = doc['title']
text = doc['text']
emb = doc['emb']
```
## Search
Have a look at [miracl-ru-queries-22-12](https://huggingface.co/datasets/Cohere/miracl-ru-queries-22-12) where we provide the query embeddings for the MIRACL dataset.
To search the documents, you must use the **dot-product** as the similarity measure.
Then compare the query embeddings with the document embeddings, either via a vector database (recommended) or by computing the dot products directly.
A full search example:
```python
# Attention! For large datasets, this requires a lot of memory to store
# all document embeddings and to compute the dot product scores.
# Only use this for smaller datasets. For large datasets, use a vector DB
from datasets import load_dataset
import torch
# Load documents + embeddings
docs = load_dataset("Cohere/miracl-ru-corpus-22-12", split="train")
doc_embeddings = torch.tensor(docs['emb'])
# Load queries
queries = load_dataset("Cohere/miracl-ru-queries-22-12", split="dev")
# Select the first query as an example
qid = 0
query = queries[qid]
query_embedding = torch.tensor([query['emb']])  # shape (1, dim) so torch.mm works
# Compute dot score between query embedding and document embeddings
dot_scores = torch.mm(query_embedding, doc_embeddings.transpose(0, 1))
top_k = torch.topk(dot_scores, k=3)
# Print results
print("Query:", query['query'])
for doc_id in top_k.indices[0].tolist():
print(docs[doc_id]['title'])
print(docs[doc_id]['text'])
```
You can get embeddings for new queries using our API:
```python
# Run: pip install cohere
import cohere
co = cohere.Client(api_key)  # You should add your Cohere API key here :)
texts = ['my search query']
response = co.embed(texts=texts, model='multilingual-22-12')
query_embedding = response.embeddings[0] # Get the embedding for the first text
```
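To score such an API-embedded query against the corpus, the same dot-product applies. A minimal sketch, assuming `query_embedding` from the snippet above and `doc_embeddings`/`docs` from the search example:
```python
import torch

# Hypothetical follow-up: reuse `doc_embeddings` and `docs` from the search example.
q = torch.tensor([query_embedding])               # shape (1, dim)
dot_scores = torch.mm(q, doc_embeddings.transpose(0, 1))
top_k = torch.topk(dot_scores, k=3)
for doc_id in top_k.indices[0].tolist():
    print(docs[doc_id]['title'])
```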
## Performance
In the following table we compare the cohere multilingual-22-12 model with Elasticsearch version 8.6.0 lexical search (title and passage indexed as independent fields). Note that Elasticsearch doesn't support all languages that are part of the MIRACL dataset.
We compute nDCG@10 (a ranking-based metric), as well as hit@3: whether at least one relevant document is among the top-3 results. We find that hit@3 is easier to interpret, as it gives the share of queries for which a relevant document is found among the top-3 results.
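For reference, here is a minimal sketch of both metrics for binary relevance judgments (not the official MIRACL evaluation script); `ranked` and `relevant` are assumed inputs:
```python
import math

def hit_at_3(ranked, relevant):
    """ranked: list of doc ids, best first; relevant: set of relevant doc ids."""
    return any(doc_id in relevant for doc_id in ranked[:3])

def ndcg_at_10(ranked, relevant):
    # DCG over the top-10 ranked docs, normalized by the ideal DCG.
    dcg = sum(1.0 / math.log2(i + 2) for i, d in enumerate(ranked[:10]) if d in relevant)
    idcg = sum(1.0 / math.log2(i + 2) for i in range(min(len(relevant), 10)))
    return dcg / idcg if idcg else 0.0
```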
Note: MIRACL only annotated a small fraction of passages (10 per query) for relevancy. Especially for larger Wikipedias (like English), we often found many more relevant passages. This is known as annotation holes. Real nDCG@10 and hit@3 performance is likely higher than depicted.
| Model | cohere multilingual-22-12 nDCG@10 | cohere multilingual-22-12 hit@3 | ES 8.6.0 nDCG@10 | ES 8.6.0 hit@3 |
|---|---|---|---|---|
| miracl-ar | 64.2 | 75.2 | 46.8 | 56.2 |
| miracl-bn | 61.5 | 75.7 | 49.2 | 60.1 |
| miracl-de | 44.4 | 60.7 | 19.6 | 29.8 |
| miracl-en | 44.6 | 62.2 | 30.2 | 43.2 |
| miracl-es | 47.0 | 74.1 | 27.0 | 47.2 |
| miracl-fi | 63.7 | 76.2 | 51.4 | 61.6 |
| miracl-fr | 46.8 | 57.1 | 17.0 | 21.6 |
| miracl-hi | 50.7 | 62.9 | 41.0 | 48.9 |
| miracl-id | 44.8 | 63.8 | 39.2 | 54.7 |
| miracl-ru | 49.2 | 66.9 | 25.4 | 36.7 |
| **Avg** | 51.7 | 67.5 | 34.7 | 46.0 |
Further languages (not supported by Elasticsearch):
| Model | cohere multilingual-22-12 nDCG@10 | cohere multilingual-22-12 hit@3 |
|---|---|---|
| miracl-fa | 44.8 | 53.6 |
| miracl-ja | 49.0 | 61.0 |
| miracl-ko | 50.9 | 64.8 |
| miracl-sw | 61.4 | 74.5 |
| miracl-te | 67.8 | 72.3 |
| miracl-th | 60.2 | 71.9 |
| miracl-yo | 56.4 | 62.2 |
| miracl-zh | 43.8 | 56.5 |
| **Avg** | 54.3 | 64.6 |
|
alkav/customerfeedbacks-llama2-80 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 12944
num_examples: 80
download_size: 7900
dataset_size: 12944
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
aekilica/Alpaca_Dolly | ---
license: apache-2.0
task_categories:
- table-question-answering
- question-answering
language:
- tr
size_categories:
- 10K<n<100K
--- |
CyberHarem/xu_fu_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of xu_fu/徐福/徐福 (Fate/Grand Order)
This is the dataset of xu_fu/徐福/徐福 (Fate/Grand Order), containing 243 images and their tags.
The core tags of this character are `black_hair, hair_over_one_eye, mole, mole_under_eye, breasts, small_breasts, hat, short_hair, hair_ornament, brown_eyes, blue_headwear, medium_hair, straight_hair, tassel`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 243 | 409.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/xu_fu_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200             | 243      | 345.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/xu_fu_fgo/resolve/main/dataset-1200.zip)               | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 550 | 673.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/xu_fu_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/xu_fu_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 28 |  |  |  |  |  | 1girl, black_gloves, chinese_clothes, hair_stick, holding_doll, long_sleeves, solo, wide_sleeves, character_doll, looking_at_viewer, gem, fingerless_gloves, bridal_gauntlets, smile, white_robe, hair_between_eyes, upper_body |
| 1 | 5 |  |  |  |  |  | 1girl, black_headwear, blush, character_doll, holding_doll, long_sleeves, smile, solo, white_robe, bridal_gauntlets, looking_at_viewer, chinese_clothes, open_mouth |
| 2 | 5 |  |  |  |  |  | 1girl, bracelet, bridal_gauntlets, chinese_clothes, holding_doll, long_sleeves, looking_at_viewer, solo, white_robe, black_headwear, character_doll, orb, grin, wide_sleeves, yellow_eyes |
| 3 | 9 |  |  |  |  |  | 1girl, black_headwear, chinese_clothes, long_sleeves, looking_at_viewer, solo, white_robe, bridal_gauntlets, bracelet, wide_sleeves, simple_background, smile, upper_body, white_background |
| 4 | 5 |  |  |  |  |  | 1girl, black_shorts, holding_cup, solo, beer_mug, long_sleeves, looking_at_viewer, off_shoulder, open_mouth, shoulder_cutout, white_shirt, hair_ribbon, hairband, open_jacket, smile, argyle_cutout, blush, pantyhose_under_shorts, white_robe |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | chinese_clothes | hair_stick | holding_doll | long_sleeves | solo | wide_sleeves | character_doll | looking_at_viewer | gem | fingerless_gloves | bridal_gauntlets | smile | white_robe | hair_between_eyes | upper_body | black_headwear | blush | open_mouth | bracelet | orb | grin | yellow_eyes | simple_background | white_background | black_shorts | holding_cup | beer_mug | off_shoulder | shoulder_cutout | white_shirt | hair_ribbon | hairband | open_jacket | argyle_cutout | pantyhose_under_shorts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:------------------|:-------------|:---------------|:---------------|:-------|:---------------|:-----------------|:--------------------|:------|:--------------------|:-------------------|:--------|:-------------|:--------------------|:-------------|:-----------------|:--------|:-------------|:-----------|:------|:-------|:--------------|:--------------------|:-------------------|:---------------|:--------------|:-----------|:---------------|:------------------|:--------------|:--------------|:-----------|:--------------|:----------------|:-------------------------|
| 0 | 28 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | X | | X | X | X | | X | X | | | X | X | X | | | X | X | X | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | X | | X | X | X | X | X | X | | | X | | X | | | X | | | X | X | X | X | | | | | | | | | | | | | |
| 3 | 9 |  |  |  |  |  | X | | X | | | X | X | X | | X | | | X | X | X | | X | X | | | X | | | | X | X | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | | | X | X | | | X | | | | X | X | | | | X | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
orgcatorg/army-technology | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: content
dtype: string
- name: title
dtype: string
- name: source_link
dtype: string
- name: description
dtype: string
- name: date
dtype: string
- name: image
dtype: string
splits:
- name: train
num_bytes: 481578
num_examples: 108
download_size: 274744
dataset_size: 481578
---
# Dataset Card for "army-technology"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nupurkmr9/tortoise | ---
license: mit
---
|
eengel7/sentiment_analysis_training_test | ---
license: apache-2.0
---
|
kainever7/sfsf | ---
license: apache-2.0
---
|
rb58853/caption_image | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 609045.0
num_examples: 5
download_size: 609068
dataset_size: 609045.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
code_x_glue_cc_code_refinement | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- code
license:
- c-uda
multilinguality:
- other-programming-languages
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text2text-generation
task_ids: []
pretty_name: CodeXGlueCcCodeRefinement
tags:
- debugging
dataset_info:
- config_name: medium
features:
- name: id
dtype: int32
- name: buggy
dtype: string
- name: fixed
dtype: string
splits:
- name: train
num_bytes: 32614786
num_examples: 52364
- name: validation
num_bytes: 4086733
num_examples: 6546
- name: test
num_bytes: 4063665
num_examples: 6545
download_size: 14929559
dataset_size: 40765184
- config_name: small
features:
- name: id
dtype: int32
- name: buggy
dtype: string
- name: fixed
dtype: string
splits:
- name: train
num_bytes: 13006679
num_examples: 46680
- name: validation
num_bytes: 1629242
num_examples: 5835
- name: test
num_bytes: 1619700
num_examples: 5835
download_size: 5894462
dataset_size: 16255621
configs:
- config_name: medium
data_files:
- split: train
path: medium/train-*
- split: validation
path: medium/validation-*
- split: test
path: medium/test-*
- config_name: small
data_files:
- split: train
path: small/train-*
- split: validation
path: small/validation-*
- split: test
path: small/test-*
---
# Dataset Card for "code_x_glue_cc_code_refinement"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits-sample-size)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/microsoft/CodeXGLUE/tree/main/Code-Code/code-refinement
- **Paper:** https://arxiv.org/abs/2102.04664
### Dataset Summary
CodeXGLUE code-refinement dataset, available at https://github.com/microsoft/CodeXGLUE/tree/main/Code-Code/code-refinement
We use the dataset released by this paper (https://arxiv.org/pdf/1812.08693.pdf). The source side is a Java function with bugs and the target side is the refined one. All function and variable names are normalized. Their dataset contains two subsets (i.e. small and medium) based on the function length.
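A minimal loading sketch with 🤗 Datasets, using the `small` config (the `medium` config works the same way):
```python
from datasets import load_dataset

ds = load_dataset("code_x_glue_cc_code_refinement", "small")
print(ds)                      # DatasetDict with train/validation/test splits
sample = ds["train"][0]
print(sample["buggy"])         # Java function containing a bug
print(sample["fixed"])         # its refined counterpart
```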
### Supported Tasks and Leaderboards
- `text2text-generation-other-debugging`: The dataset can be used to train a model for automatically fixing buggy code.
### Languages
- Java **programming** language
## Dataset Structure
### Data Instances
#### medium
An example of 'train' looks as follows.
```
{
"buggy": "public static TYPE_1 init ( java.lang.String name , java.util.Date date ) { TYPE_1 VAR_1 = new TYPE_1 ( ) ; VAR_1 . METHOD_1 ( name ) ; java.util.Calendar VAR_2 = java.util.Calendar.getInstance ( ) ; VAR_2 . METHOD_2 ( date ) ; VAR_1 . METHOD_3 ( VAR_2 ) ; return VAR_1 ; }\n",
"fixed": "public static TYPE_1 init ( java.lang.String name , java.util.Date date ) { TYPE_1 VAR_1 = new TYPE_1 ( ) ; VAR_1 . METHOD_1 ( name ) ; java.util.Calendar VAR_2 = null ; if ( date != null ) { VAR_2 = java.util.Calendar.getInstance ( ) ; VAR_2 . METHOD_2 ( date ) ; } VAR_1 . METHOD_3 ( VAR_2 ) ; return VAR_1 ; }\n",
"id": 0
}
```
#### small
An example of 'validation' looks as follows.
```
{
"buggy": "public java.util.List < TYPE_1 > METHOD_1 ( ) { java.util.ArrayList < TYPE_1 > VAR_1 = new java.util.ArrayList < TYPE_1 > ( ) ; for ( TYPE_2 VAR_2 : VAR_3 ) { VAR_1 . METHOD_2 ( VAR_2 . METHOD_1 ( ) ) ; } return VAR_1 ; } \n",
"fixed": "public java.util.List < TYPE_1 > METHOD_1 ( ) { return VAR_1 ; } \n",
"id": 0
}
```
### Data Fields
In the following, each data field is explained for each config. The data fields are the same among all splits.
#### medium, small
|field name| type | description |
|----------|------|--------------------------------|
|id |int32 | Index of the sample |
|buggy |string| The buggy version of the code |
|fixed |string| The correct version of the code|
### Data Splits
| name |train|validation|test|
|------|----:|---------:|---:|
|medium|52364| 6546|6545|
|small |46680| 5835|5835|
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
Every public GitHub event between March 2011 and October 2017 was downloaded from GitHub Archive and mined using the Google BigQuery APIs.
[More Information Needed]
#### Who are the source language producers?
Software Engineering developers.
### Annotations
#### Annotation process
Automatically annotated by filtering commit messages containing the pattern: ("fix" or "solve") and ("bug" or "issue" or "problem" or "error"). A statistically significant sample (95% confidence level with a 5% confidence interval) was manually evaluated by two authors to check whether the filtered bug/fix pairs were correct. After all disagreements were settled, the authors concluded that 97.6% were true positives.
#### Who are the annotators?
Heuristics and the authors of the paper.
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
https://github.com/microsoft, https://github.com/madlag
### Licensing Information
Computational Use of Data Agreement (C-UDA) License.
### Citation Information
```
@article{DBLP:journals/corr/abs-2102-04664,
author = {Shuai Lu and
Daya Guo and
Shuo Ren and
Junjie Huang and
Alexey Svyatkovskiy and
Ambrosio Blanco and
Colin B. Clement and
Dawn Drain and
Daxin Jiang and
Duyu Tang and
Ge Li and
Lidong Zhou and
Linjun Shou and
Long Zhou and
Michele Tufano and
Ming Gong and
Ming Zhou and
Nan Duan and
Neel Sundaresan and
Shao Kun Deng and
Shengyu Fu and
Shujie Liu},
title = {CodeXGLUE: {A} Machine Learning Benchmark Dataset for Code Understanding
and Generation},
journal = {CoRR},
volume = {abs/2102.04664},
year = {2021}
}
@article{tufano2019empirical,
title={An empirical study on learning bug-fixing patches in the wild via neural machine translation},
author={Tufano, Michele and Watson, Cody and Bavota, Gabriele and Penta, Massimiliano Di and White, Martin and Poshyvanyk, Denys},
journal={ACM Transactions on Software Engineering and Methodology (TOSEM)},
volume={28},
number={4},
pages={1--29},
year={2019},
publisher={ACM New York, NY, USA}
}
```
### Contributions
Thanks to @madlag (and partly also @ncoop57) for adding this dataset. |
sangjeedondrub/tmg-datasets | ---
dataset_info:
features:
- name: raw
dtype: string
- name: accompaniment
dtype: string
- name: pitch
dtype: string
- name: vocal
dtype: string
splits:
- name: train
num_bytes: 50676
num_examples: 246
download_size: 41377
dataset_size: 50676
---
# Dataset Card for "tmg-datasets"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_72 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 24985638576.875
num_examples: 260137
download_size: 22601140207
dataset_size: 24985638576.875
---
# Dataset Card for "chunk_72"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lumilogic/climateBUG-Data | ---
task_categories:
- text-classification
language:
- en
tags:
- climate
- finance
- banking
- EU
pretty_name: climateBUG-Data
size_categories:
- 1M<n<10M
license: "cc-by-nc-nd-4.0"
dataset_info:
features:
- name: statement
dtype: string
- name: year
dtype: int64
- name: label
dtype:
class_label:
names:
'0': non-climate
'1': climate
- name: manual
dtype: bool
configs:
- config_name: default
data_files:
- split: train
path: "data/climatebug-data-train.parquet"
- split: test
path: "data/climatebug-data-test.parquet"
- config_name: main
data_files: "data/climatebug-data.parquet"
extra_gated_prompt: >-
Welcome to the access request form for climateBUG-Data. This dataset is available under the CC BY-NC-ND 4.0 license with the additional condition that the dataset cannot be reshared in any form.
Interested in commercial licensing? Let us know here. We're open to exploring how our dataset can support your projects.
extra_gated_fields:
"Name": text
"Email": text
"Affiliation": text
"Purpose of Use": text
"I acknowledge the non-commercial use terms of this dataset, unless a commercial license is granted": checkbox
extra_gated_heading: "Access Request for climateBUG-Data"
extra_gated_button_content: "Request Access"
---
# climateBUG-Data
## Overview
climateBUG-Data is a part of the climateBUG framework. It focuses on analyzing climate-related discussions in EU banks' reporting using computational linguistics.
## Key Features
- **Dataset Composition**: The dataset includes over 1.07 million annotated statements from EU banks' annual and sustainability reports, covering the years 2015 to 2020. It provides an analysis of climate change and finance topics discussed in the European banking sector during this period.
- **Integration with climateBUG Framework**: Designed to be utilized with the climateBUG framework's deep learning model and analytical tools.
## Access and Usage
- Models, dataset and tools are available at the [climateBUG project page](https://www.climatebug.se/).
- Suitable for researchers and professionals in finance, sustainability, and climate policy.
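A minimal loading sketch (note that the dataset is gated, so you must first request access on the Hub and authenticate locally, e.g. via `huggingface-cli login`):
```python
from datasets import load_dataset

ds = load_dataset("lumilogic/climateBUG-Data")  # default config: train/test splits
example = ds["train"][0]
# Fields: statement (str), year (int), label (0 = non-climate, 1 = climate), manual (bool)
print(example["statement"], example["year"], example["label"], example["manual"])
```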
## Licensing and Availability
### Non-Commercial Research
- **License**: The climateBUG-Data, including its models and tools, is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0).
- **Additional Restriction**: Redistribution or sharing of the dataset in any form is not permitted. This additional restriction complements the standard CC BY-NC-ND 4.0 license terms.
- **License Details**: Please review the [CC BY-NC-ND 4.0 license](https://creativecommons.org/licenses/by-nc-nd/4.0/) for complete terms, keeping in mind our specific no-resharing clause.
### Commercial Use
- **Open to Collaboration**: We welcome interest from commercial entities and are open to exploring how the climateBUG-Data can contribute to your projects and initiatives.
- **Tailored Licensing Arrangements**: Understanding that commercial needs can vary, we are ready to discuss customized licensing arrangements that align with your specific requirements.
- **Contact Us**: To discuss potential collaborations and commercial licensing options, please reach out to us at climatebug@lumilogic.se.
## Citation
Please cite this dataset as follows:
Yu, Y., Scheidegger, S., Elliott, J., & Löfgren, Å. (2024). climateBUG: A data-driven framework for analyzing bank reporting through a climate lens. Expert Systems With Applications, 239, 122162.
```bibtex
@article{yu2024climatebug,
title = {climateBUG: A data-driven framework for analyzing bank reporting through a climate lens},
journal = {Expert Systems with Applications},
volume = {239},
pages = {122162},
year = {2024},
author = {Yinan Yu and Samuel Scheidegger and Jasmine Elliott and Åsa Löfgren}
}
```
## Support and Contact
For support, additional information, or inquiries, please reach out through climatebug@lumilogic.se or visit the [climateBUG project page](https://www.climatebug.se/).
|
aisuko/sentences_of_Melbourne | ---
license: apache-2.0
---
This dataset includes the embeddings of 10 sentences describing information about the City of Melbourne. You can check these sentences in the Kaggle notebook [Semantic Search](https://www.kaggle.com/code/aisuko/semantic-search).
# How to use it
```python
from datasets import load_dataset
embeddings_ds = load_dataset('aisuko/sentences_of_Melbourne')
``` |
open-llm-leaderboard/details_aqweteddy__mistral_tv-neural-marconroni | ---
pretty_name: Evaluation run of aqweteddy/mistral_tv-neural-marconroni
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [aqweteddy/mistral_tv-neural-marconroni](https://huggingface.co/aqweteddy/mistral_tv-neural-marconroni)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aqweteddy__mistral_tv-neural-marconroni\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-29T21:06:58.547736](https://huggingface.co/datasets/open-llm-leaderboard/details_aqweteddy__mistral_tv-neural-marconroni/blob/main/results_2023-12-29T21-06-58.547736.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.653997201380612,\n\
\ \"acc_stderr\": 0.031948482529961096,\n \"acc_norm\": 0.6549938668135185,\n\
\ \"acc_norm_stderr\": 0.032596331692297566,\n \"mc1\": 0.4418604651162791,\n\
\ \"mc1_stderr\": 0.017384767478986218,\n \"mc2\": 0.6002963889764419,\n\
\ \"mc2_stderr\": 0.015327998641933535\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6604095563139932,\n \"acc_stderr\": 0.013839039762820167,\n\
\ \"acc_norm\": 0.6919795221843004,\n \"acc_norm_stderr\": 0.013491429517292038\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6724756024696276,\n\
\ \"acc_stderr\": 0.004683511716552242,\n \"acc_norm\": 0.8625771758613822,\n\
\ \"acc_norm_stderr\": 0.003435895386692258\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.03639057569952928,\n\
\ \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.03639057569952928\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778394,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778394\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6871794871794872,\n \"acc_stderr\": 0.02350757902064536,\n \
\ \"acc_norm\": 0.6871794871794872,\n \"acc_norm_stderr\": 0.02350757902064536\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.02911661760608301,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02911661760608301\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.02983796238829194,\n \
\ \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.02983796238829194\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374308,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374308\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553353,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553353\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n\
\ \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n\
\ \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.024027745155265023,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.024027745155265023\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4100558659217877,\n\
\ \"acc_stderr\": 0.016449708209026078,\n \"acc_norm\": 0.4100558659217877,\n\
\ \"acc_norm_stderr\": 0.016449708209026078\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292452,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292452\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799208,\n\
\ \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799208\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n\
\ \"acc_stderr\": 0.012740853872949834,\n \"acc_norm\": 0.4661016949152542,\n\
\ \"acc_norm_stderr\": 0.012740853872949834\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6862745098039216,\n \"acc_stderr\": 0.01877168389352818,\n \
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.01877168389352818\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4418604651162791,\n\
\ \"mc1_stderr\": 0.017384767478986218,\n \"mc2\": 0.6002963889764419,\n\
\ \"mc2_stderr\": 0.015327998641933535\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8089976322020521,\n \"acc_stderr\": 0.011047808761510425\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6618650492797574,\n \
\ \"acc_stderr\": 0.013030829145172217\n }\n}\n```"
repo_url: https://huggingface.co/aqweteddy/mistral_tv-neural-marconroni
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|arc:challenge|25_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|gsm8k|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hellaswag|10_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T21-06-58.547736.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T21-06-58.547736.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- '**/details_harness|winogrande|5_2023-12-29T21-06-58.547736.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-29T21-06-58.547736.parquet'
- config_name: results
data_files:
- split: 2023_12_29T21_06_58.547736
path:
- results_2023-12-29T21-06-58.547736.parquet
- split: latest
path:
- results_2023-12-29T21-06-58.547736.parquet
---
# Dataset Card for Evaluation run of aqweteddy/mistral_tv-neural-marconroni
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [aqweteddy/mistral_tv-neural-marconroni](https://huggingface.co/aqweteddy/mistral_tv-neural-marconroni) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aqweteddy__mistral_tv-neural-marconroni",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-29T21:06:58.547736](https://huggingface.co/datasets/open-llm-leaderboard/details_aqweteddy__mistral_tv-neural-marconroni/blob/main/results_2023-12-29T21-06-58.547736.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.653997201380612,
"acc_stderr": 0.031948482529961096,
"acc_norm": 0.6549938668135185,
"acc_norm_stderr": 0.032596331692297566,
"mc1": 0.4418604651162791,
"mc1_stderr": 0.017384767478986218,
"mc2": 0.6002963889764419,
"mc2_stderr": 0.015327998641933535
},
"harness|arc:challenge|25": {
"acc": 0.6604095563139932,
"acc_stderr": 0.013839039762820167,
"acc_norm": 0.6919795221843004,
"acc_norm_stderr": 0.013491429517292038
},
"harness|hellaswag|10": {
"acc": 0.6724756024696276,
"acc_stderr": 0.004683511716552242,
"acc_norm": 0.8625771758613822,
"acc_norm_stderr": 0.003435895386692258
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.03639057569952928,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.03639057569952928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778394,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778394
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6871794871794872,
"acc_stderr": 0.02350757902064536,
"acc_norm": 0.6871794871794872,
"acc_norm_stderr": 0.02350757902064536
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.02911661760608301,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.02911661760608301
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.02983796238829194,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.02983796238829194
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374308,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374308
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553353,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553353
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.024027745155265023,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.024027745155265023
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4100558659217877,
"acc_stderr": 0.016449708209026078,
"acc_norm": 0.4100558659217877,
"acc_norm_stderr": 0.016449708209026078
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292452,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292452
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.025006469755799208,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.025006469755799208
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4661016949152542,
"acc_stderr": 0.012740853872949834,
"acc_norm": 0.4661016949152542,
"acc_norm_stderr": 0.012740853872949834
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.01877168389352818,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.01877168389352818
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4418604651162791,
"mc1_stderr": 0.017384767478986218,
"mc2": 0.6002963889764419,
"mc2_stderr": 0.015327998641933535
},
"harness|winogrande|5": {
"acc": 0.8089976322020521,
"acc_stderr": 0.011047808761510425
},
"harness|gsm8k|5": {
"acc": 0.6618650492797574,
"acc_stderr": 0.013030829145172217
}
}
```
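The aggregated metrics shown above can also be loaded directly from the `results` configuration (a minimal sketch, using the `latest` split defined in the configuration list above):
```python
from datasets import load_dataset

# Load the aggregated results of the most recent run; the "latest" split
# always points at the newest results parquet file
results = load_dataset(
    "open-llm-leaderboard/details_aqweteddy__mistral_tv-neural-marconroni",
    "results",
    split="latest",
)
print(results[0])
```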
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
huggingartists/kasta | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/kasta"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.992175 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/4fb42a447843eee46b0b77439ecd8fd2.800x800x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/kasta">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Каста (Kasta)</div>
<a href="https://genius.com/artists/kasta">
<div style="text-align: center; font-size: 14px;">@kasta</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
The model is available [here](https://huggingface.co/huggingartists/kasta).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/kasta")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train | validation | test |
|------:|-----------:|-----:|
|   235 |          - |    - |

The 'train' split can easily be divided into 'train', 'validation' and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np

datasets = load_dataset("huggingartists/kasta")

# Target proportions: 90% train, 7% validation, 3% test
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03

texts = datasets['train']['text']
# np.split takes cut points, not sizes; everything after the second cut
# (the remaining 3%) becomes the test set
train, validation, test = np.split(
    texts,
    [
        int(len(texts) * train_percentage),
        int(len(texts) * (train_percentage + validation_percentage)),
    ],
)

datasets = DatasetDict(
    {
        'train': Dataset.from_dict({'text': list(train)}),
        'validation': Dataset.from_dict({'text': list(validation)}),
        'test': Dataset.from_dict({'text': list(test)})
    }
)
```
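A quick sanity check (continuing directly from the snippet above) that the resulting splits roughly match the 90/7/3 proportions:
```python
# Sanity check: print the size of each resulting split
for name, split in datasets.items():
    print(name, len(split))
```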
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
    author={Aleksey Korshuk},
    year={2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
fathyshalab/MDCSI_mode-schmuck-zubehoer | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: label_name
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 149145
num_examples: 348
- name: test
num_bytes: 37617
num_examples: 88
download_size: 101309
dataset_size: 186762
---
# Dataset Card for "reklamation24_mode-schmuck-zubehoer-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
megantron/jarred_photos | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 20804095.0
num_examples: 19
download_size: 20781723
dataset_size: 20804095.0
---
# Dataset Card for "jarred_photos"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_mnli_comparative_more_and | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 20865
num_examples: 90
- name: dev_mismatched
num_bytes: 24150
num_examples: 93
- name: test_matched
num_bytes: 23648
num_examples: 96
- name: test_mismatched
num_bytes: 32892
num_examples: 125
- name: train
num_bytes: 878395
num_examples: 3645
download_size: 566194
dataset_size: 979950
---
# Dataset Card for "MULTI_VALUE_mnli_comparative_more_and"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
quocanh34/thaivanlinhtest_new_cut_final | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: w2v2_transcription
dtype: string
- name: WER
dtype: int64
splits:
- name: train
num_bytes: 89394264.0
num_examples: 1077
download_size: 89354930
dataset_size: 89394264.0
---
# Dataset Card for "thaivanlinhtest_new_cut_final"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Valyusha/4chan-TTTT-2017 | ---
license: afl-3.0
---
|
tyzhu/synpre_set_1M_token_3000 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 1201540790
num_examples: 1000000
- name: validation
num_bytes: 12049203
num_examples: 10000
download_size: 733055816
dataset_size: 1213589993
---
# Dataset Card for "synpre_set_1M_token_3000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zkdeng/combinedSpidersBalanced | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Acantholycosa_lignaria
'1': Aculepeira_ceropegia
'2': Agalenatea_redii
'3': Agelena_labyrinthica
'4': Aglaoctenus_castaneus
'5': Aglaoctenus_lagotis
'6': Allocosa_funerea
'7': Allotrochosina_schauinslandi
'8': Alopecosa_albofasciata
'9': Alopecosa_barbipes
'10': Alopecosa_cuneata
'11': Alopecosa_inquilina
'12': Alopecosa_kochi
'13': Alopecosa_pulverulenta
'14': Anahita_punctulata
'15': Anasaitis_canosa
'16': Ancylometes_bogotensis
'17': Ancylometes_concolor
'18': Ancylometes_rufus
'19': Anoteropsis_hilaris
'20': Anoteropsis_litoralis
'21': Anyphaena_accentuata
'22': Aphonopelma_hentzi
'23': Araneus_diadematus
'24': Araneus_marmoreus
'25': Araneus_quadratus
'26': Araneus_trifolium
'27': Araniella_displicata
'28': Arctosa_cinerea
'29': Arctosa_leopardus
'30': Arctosa_littoralis
'31': Arctosa_perita
'32': Arctosa_personata
'33': Argiope_argentata
'34': Argiope_aurantia
'35': Argiope_bruennichi
'36': Argiope_keyserlingi
'37': Argiope_lobata
'38': Argiope_trifasciata
'39': Asthenoctenus_borellii
'40': Attulus_fasciger
'41': Aulonia_albimana
'42': Austracantha_minax
'43': Badumna_longinqua
'44': Carrhotus_xanthogramma
'45': Centroctenus_brevipes
'46': Cheiracanthium_erraticum
'47': Cheiracanthium_gracile
'48': Cheiracanthium_inclusum
'49': Cheiracanthium_mildei
'50': Cheiracanthium_punctorium
'51': Colonus_hesperus
'52': Colonus_sylvanus
'53': Ctenus_amphora
'54': Ctenus_hibernalis
'55': Ctenus_medius
'56': Ctenus_ornatus
'57': Cupiennius_coccineus
'58': Cupiennius_getazi
'59': Cupiennius_salei
'60': Cyclosa_conica
'61': Cyrtophora_citricola
'62': Diapontia_uruguayensis
'63': Dolomedes_albineus
'64': Dolomedes_minor
'65': Dolomedes_scriptus
'66': Dolomedes_tenebrosus
'67': Dolomedes_triton
'68': Dysdera_crocata
'69': Ebrechtella_tricuspidata
'70': Enoplognatha_ovata
'71': Eratigena_agrestis
'72': Eratigena_duellica
'73': Eriophora_ravilla
'74': Eris_militaris
'75': Evarcha_arcuata
'76': Gasteracantha_cancriformis
'77': Geolycosa_vultuosa
'78': Gladicosa_gulosa
'79': Gladicosa_pulchra
'80': Habronattus_pyrrithrix
'81': Hasarius_adansoni
'82': Helpis_minitabunda
'83': Hentzia_mitrata
'84': Hentzia_palmarum
'85': Herpyllus_ecclesiasticus
'86': Heteropoda_venatoria
'87': Hippasa_holmerae
'88': Hogna_antelucana
'89': Hogna_baltimoriana
'90': Hogna_bivittata
'91': Hogna_carolinensis
'92': Hogna_crispipes
'93': Hogna_frondicola
'94': Hogna_gumia
'95': Hogna_radiata
'96': Holocnemus_pluchei
'97': Kukulcania_hibernalis
'98': Lampona_cylindrata
'99': Larinioides_cornutus
'100': Larinioides_sclopetarius
'101': Latrodectus_bishopi
'102': Latrodectus_curacaviensis
'103': Latrodectus_geometricus
'104': Latrodectus_hasselti
'105': Latrodectus_hesperus
'106': Latrodectus_katipo
'107': Latrodectus_mactans
'108': Latrodectus_mirabilis
'109': Latrodectus_renivulvatus
'110': Latrodectus_tredecimguttatus
'111': Latrodectus_variolus
'112': Leucauge_argyra
'113': Leucauge_argyrobapta
'114': Leucauge_dromedaria
'115': Leucauge_venusta
'116': Loxosceles_amazonica
'117': Loxosceles_deserta
'118': Loxosceles_laeta
'119': Loxosceles_reclusa
'120': Loxosceles_rufescens
'121': Loxosceles_tenochtitlan
'122': Loxosceles_yucatana
'123': Lycosa_erythrognatha
'124': Lycosa_hispanica
'125': Lycosa_pampeana
'126': Lycosa_praegrandis
'127': Lycosa_singoriensis
'128': Lycosa_tarantula
'129': Lyssomanes_viridis
'130': Maevia_inclemens
'131': Mangora_acalypha
'132': Maratus_griseus
'133': Marpissa_muscosa
'134': Mecynogea_lemniscata
'135': Menemerus_bivittatus
'136': Menemerus_semilimbatus
'137': Micrathena_gracilis
'138': Micrathena_sagittata
'139': Micrommata_virescens
'140': Missulena_bradleyi
'141': Missulena_occatoria
'142': Misumena_vatia
'143': Misumenoides_formosipes
'144': Misumessus_oblongus
'145': Naphrys_pulex
'146': Neoscona_arabesca
'147': Neoscona_crucifera
'148': Neoscona_oaxacensis
'149': Nephila_pilipes
'150': Neriene_radiata
'151': Nesticodes_rufipes
'152': Nuctenea_umbratica
'153': Oxyopes_salticus
'154': Oxyopes_scalaris
'155': Paraphidippus_aurantius
'156': Parasteatoda_tepidariorum
'157': Paratrochosina_amica
'158': Pardosa_amentata
'159': Pardosa_lapidicina
'160': Pardosa_mercurialis
'161': Pardosa_moesta
'162': Pardosa_wagleri
'163': Peucetia_viridans
'164': Phidippus_audax
'165': Phidippus_clarus
'166': Phidippus_johnsoni
'167': Phidippus_putnami
'168': Philaeus_chrysops
'169': Philodromus_dispar
'170': Pholcus_phalangioides
'171': Phoneutria_boliviensis
'172': Phoneutria_depilata
'173': Phoneutria_fera
'174': Phoneutria_nigriventer
'175': Phoneutria_pertyi
'176': Phoneutria_reidyi
'177': Pirata_piraticus
'178': Pisaura_mirabilis
'179': Pisaurina_mira
'180': Platycryptus_californicus
'181': Platycryptus_undatus
'182': Plebs_eburnus
'183': Plexippus_paykulli
'184': Portacosa_cinerea
'185': Rabidosa_hentzi
'186': Rabidosa_punctulata
'187': Rabidosa_rabida
'188': Salticus_scenicus
'189': Sassacus_vitis
'190': Schizocosa_avida
'191': Schizocosa_malitiosa
'192': Schizocosa_mccooki
'193': Scytodes_thoracica
'194': Sicarius_thomisoides
'195': Socca_pustulosa
'196': Sosippus_californicus
'197': Steatoda_grossa
'198': Steatoda_nobilis
'199': Steatoda_triangulosa
'200': Synema_globosum
'201': Thomisus_onustus
'202': Tigrosa_annexa
'203': Tigrosa_aspersa
'204': Tigrosa_georgicola
'205': Tigrosa_helluo
'206': Trichonephila_clavata
'207': Trichonephila_clavipes
'208': Trichonephila_edulis
'209': Trichonephila_plumipes
'210': Trochosa_ruricola
'211': Trochosa_sepulchralis
'212': Trochosa_terricola
'213': Tropicosa_moesta
'214': Venator_immansuetus
'215': Venator_spenceri
'216': Venatrix_furcillata
'217': Verrucosa_arenata
'218': Wadicosa_fidelis
'219': Xerolycosa_miniata
'220': Xerolycosa_nemoralis
'221': Zoropsis_spinimana
'222': Zygiella_x-notata
splits:
- name: train
num_bytes: 6290926802.784
num_examples: 269172
download_size: 5634209308
dataset_size: 6290926802.784
---
# Dataset Card for "combinedSpidersBalanced"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ovior/twitter_dataset_1713212121 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2641324
num_examples: 8125
download_size: 1485829
dataset_size: 2641324
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
joey234/medmcqa-original-neg-rule-neg-prepend | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: opa
dtype: string
- name: opb
dtype: string
- name: opc
dtype: string
- name: opd
dtype: string
- name: cop
dtype:
class_label:
names:
'0': a
'1': b
'2': c
'3': d
- name: choice_type
dtype: string
- name: exp
dtype: string
- name: subject_name
dtype: string
- name: topic_name
dtype: string
- name: neg_prompt
dtype: string
splits:
- name: validation
num_bytes: 544885
num_examples: 690
download_size: 354406
dataset_size: 544885
---
# Dataset Card for "medmcqa-original-neg-rule-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
helling100/avia_review_ru | ---
license: cc-by-nc-4.0
task_categories:
- text-classification
- feature-extraction
language:
- ru
size_categories:
- 10K<n<100K
--- |
mponty/web_crawl_docs | ---
dataset_info:
features:
- name: text
dtype: string
- name: lang
dtype: string
- name: source
dtype: string
- name: url
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 1157430740
num_examples: 87370
download_size: 492924255
dataset_size: 1157430740
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "web_crawl_docs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
22Plaruno/face_ds | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 10359951.0
num_examples: 100
download_size: 10362224
dataset_size: 10359951.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "face_ds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rshrott/rates | ---
dataset_info:
features:
- name: Date
dtype: timestamp[ns]
- name: Rate
dtype: float64
splits:
- name: train
num_bytes: 3488
num_examples: 218
download_size: 4025
dataset_size: 3488
---
# Dataset Card for "rates"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HAERAE-HUB/csatqa | ---
dataset_info:
features:
- name: test_name
dtype: string
- name: question_number
dtype: int64
- name: context
dtype: string
- name: question
dtype: string
- name: gold
dtype: int64
- name: option#1
dtype: string
- name: option#2
dtype: string
- name: option#3
dtype: string
- name: option#4
dtype: string
- name: option#5
dtype: string
- name: Category
dtype: string
- name: Human_Peformance
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 4220807
num_examples: 936
download_size: 1076028
dataset_size: 4220807
task_categories:
- multiple-choice
language:
- ko
---
# Dataset Card for "CSAT-QA"
## Dataset Summary
The field of Korean Language Processing is experiencing a surge in interest,
illustrated by the introduction of open-source models such as Polyglot-Ko and proprietary models like HyperClova.
Yet, as the development of larger and superior language models accelerates, evaluation methods aren't keeping pace.
Recognizing this gap, we at HAE-RAE are dedicated to creating tailored benchmarks for the rigorous evaluation of these models.
CSAT-QA is a comprehensive collection of 936 multiple-choice question answering (MCQA) questions,
manually collected from the College Scholastic Ability Test (CSAT), a rigorous Korean university entrance exam.
The CSAT-QA is divided into two subsets: a complete version encompassing all 936 questions,
and a smaller, specialized version used for targeted evaluations.
The smaller subset is further divided into six distinct categories:
Writing (WR), Grammar (GR), Reading Comprehension: Science (RCS), Reading Comprehension: Social Science (RCSS),
Reading Comprehension: Humanities (RCH), and Literature (LI). Moreover, the smaller subset includes the recorded accuracy of South Korean students,
providing a valuable real-world performance benchmark.
For a detailed explanation of how the CSAT-QA was created,
please check out the [accompanying blog post](https://github.com/guijinSON/hae-rae/blob/main/blog/CSAT-QA.md),
and for evaluation check out [LM-Eval-Harness](https://github.com/EleutherAI/lm-evaluation-harness) on GitHub.
## Evaluation Results
| **Models** | **GR** | **LI** | **RCH** | **RCS** | **RCSS** | **WR** | **Average** |
|:-----------------:|:---------:|:---------:|:---------:|:---------:|:---------:|:---------:|:-----------:|
| polyglot-ko-12.8B | 32.0 | 29.73 | 17.14 | 10.81 | 21.43 | 18.18 | 21.55 |
| gpt-3.5-wo-token | 16.0 | 32.43 | 42.86 | 18.92 | 35.71 | 0.00 | 24.32 |
| gpt-3.5-w-token | 16.0 | 35.14 | 42.86 | 18.92 | 35.71 | 9.09 | 26.29 |
| gpt-4-wo-token | 40.0 | 54.05 | **68.57** | **59.46** | **69.05** | 36.36 | **54.58** |
| gpt-4-w-token | 36.0 | **56.76** | **68.57** | **59.46** | **69.05** | 36.36 | 54.37 |
| Human Performance | **45.41** | 54.38 | 48.7 | 39.93 | 44.54 | **54.0** | 47.83 |
## How to Use
The CSAT-QA includes two subsets. The full version with 936 questions can be downloaded using the following code:
```
from datasets import load_dataset
dataset = load_dataset("EleutherAI/CSAT-QA", "full")
```
A more condensed version, which includes human accuracy data, can be downloaded using the following code:
```
from datasets import load_dataset
dataset = load_dataset("EleutherAI/CSAT-QA", "GR")  # Choose from WR, GR, LI, RCH, RCS, or RCSS
```
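As a quick follow-up, the recorded human accuracy can be read straight from the loaded subset (a sketch, assuming the condensed subset exposes a single `train` split; note that the column is spelled `Human_Peformance` in the dataset features):
```
from datasets import load_dataset

# A sketch: read the recorded human accuracy for the Grammar (GR) subset.
# Assumes the condensed subset exposes a single "train" split.
gr = load_dataset("EleutherAI/CSAT-QA", "GR")["train"]
print(gr[0]["Human_Peformance"])  # column name as spelled in the dataset
```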
## Evaluate using LM-Eval-Harness
To evaluate your model using the LM-Eval-Harness by EleutherAI, follow the steps below.
1. To install lm-eval from the github repository main branch, run:
```
git clone https://github.com/EleutherAI/lm-evaluation-harness
cd lm-evaluation-harness
pip install -e .
```
2. To install additional multilingual tokenization and text segmentation packages, you must install the package with the multilingual extra:
```
pip install -e ".[multilingual]"
```
3. Run the evaluation by:
```
python main.py \
--model hf-causal \
--model_args pretrained=EleutherAI/polyglot-ko-1.3b \
--tasks csatqa_wr,csatqa_gr,csatqa_rcs,csatqa_rcss,csatqa_rch,csatqa_li \
--device cuda:0
```
## License
The copyright of this material belongs to the Korea Institute for Curriculum and Evaluation (한국교육과정평가원) and may be used for research purposes only.
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
oserikov/arabic_billion_words | ---
annotations_creators:
- found
language_creators:
- found
language:
- ar
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
- 10K<n<100K
- 1M<n<10M
source_datasets:
- original
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
paperswithcode_id: null
pretty_name: Arabic Billion Words
dataset_info:
- config_name: Alittihad
features:
- name: url
dtype: string
- name: head_line
dtype: string
- name: date
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1601790302
num_examples: 349342
download_size: 348259999
dataset_size: 1601790302
- config_name: Almasryalyoum
features:
- name: url
dtype: string
- name: head_line
dtype: string
- name: date
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1056197870
num_examples: 291723
download_size: 242604438
dataset_size: 1056197870
- config_name: Almustaqbal
features:
- name: url
dtype: string
- name: head_line
dtype: string
- name: date
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1545659336
num_examples: 446873
download_size: 350826797
dataset_size: 1545659336
- config_name: Alqabas
features:
- name: url
dtype: string
- name: head_line
dtype: string
- name: date
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 2631729746
num_examples: 817274
download_size: 595274646
dataset_size: 2631729746
- config_name: Echoroukonline
features:
- name: url
dtype: string
- name: head_line
dtype: string
- name: date
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 464386206
num_examples: 139732
download_size: 108184378
dataset_size: 464386206
- config_name: Ryiadh
features:
- name: url
dtype: string
- name: head_line
dtype: string
- name: date
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 3101294859
num_examples: 858188
download_size: 691264971
dataset_size: 3101294859
- config_name: Sabanews
features:
- name: url
dtype: string
- name: head_line
dtype: string
- name: date
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 198019614
num_examples: 92149
download_size: 38214558
dataset_size: 198019614
- config_name: SaudiYoum
features:
- name: url
dtype: string
- name: head_line
dtype: string
- name: date
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 2723291416
num_examples: 888068
download_size: 605537923
dataset_size: 2723291416
- config_name: Techreen
features:
- name: url
dtype: string
- name: head_line
dtype: string
- name: date
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1103458209
num_examples: 314597
download_size: 252976781
dataset_size: 1103458209
- config_name: Youm7
features:
- name: url
dtype: string
- name: head_line
dtype: string
- name: date
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 3004689464
num_examples: 1172136
download_size: 617708074
dataset_size: 3004689464
config_names:
- Alittihad
- Almasryalyoum
- Almustaqbal
- Alqabas
- Echoroukonline
- Ryiadh
- Sabanews
- SaudiYoum
- Techreen
- Youm7
---
# Dataset Card for Arabic Billion Words Corpus
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** http://www.abuelkhair.net/index.php/en/arabic/abu-el-khair-corpus
- **Repository:**
- **Paper:** https://arxiv.org/pdf/1611.04033
- **Leaderboard:**
- **Point of Contact:** [Ibrahim Abu El-Khair](mailto:iabuelkhair@gmail.com)
### Dataset Summary
Abu El-Khair Corpus is an Arabic text corpus that includes more than five million newspaper articles.
It contains over a billion and a half words in total, of which about three million are unique.
The corpus is available in two encodings, UTF-8 and Windows CP-1256,
and is marked up with two markup languages, SGML and XML.
**NB:** this dataset is based on the [unofficial copy](https://drive.google.com/drive/folders/1F2wCEfFHzJqX7eTuWhh-pGtrsaHPvTT8?usp=drive_link) ([discussion](https://huggingface.co/datasets/arabic_billion_words/discussions/3)) of the data, and assumes it was downloaded properly. Put the `new_data_*` files into the `./dataset` folder like this:
```
[user@machine /path/to/dataset]$ tree
.
├── arabic_billion_words.py
├── dataset
│ ├── new_data_Alittihad_XML_utf_8.rar
│ ├── new_data_Almasryalyoum_XML_utf_8.rar
│ ├── new_data_Almustaqbal_XML_utf_8.rar
│ ├── new_data_Alqabas_XML_utf_8.rar
│ ├── new_data_Echoroukonline_XML_utf_8.rar
│ ├── new_data_Ryiadh_XML_utf_8.rar
│ ├── new_data_Sabanews_XML_utf_8.rar
│ ├── new_data_SaudiYoum_XML_utf_8.rar
│ ├── new_data_Techreen_XML_utf_8.rar
│ └── new_data_Youm7_XML_utf_8.rar
├── dataset_infos.json
├── README.md
└── usage_example.py
```
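With the archives in place, a single configuration can then be loaded through the local script (a minimal sketch, assuming the layout above; `Alittihad` is just one of the configurations listed below):
```python
from datasets import load_dataset

# A sketch: load the "Alittihad" configuration via the local loading script,
# assuming the RAR archives sit in ./dataset as shown in the tree above
dataset = load_dataset("./arabic_billion_words.py", "Alittihad")
print(dataset["train"][0]["head_line"])
```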
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
Arabic
## Dataset Structure
### Data Instances
This is an example of the "Almasryalyoum" configuration subset:
```python
{
"url": "http://today.almasryalyoum.com/printerfriendly.aspx?ArticleID=61300",
"head_line": "رئيس وزراء المجر: عنصرية جماهير أوجبيست جلبت العار للبلاد",
"date": "19/5/2007",
"text": """قال متحدث باسم الحكومة المجرية: إن رئيس الوزراء فيرنك جيوركساني رحب بقرار اتحاد كرة القدم المجري بخصم ثلاث نقاط من نادي أوجبيست بسبب السلوك العنصري الذي صدر من جماهيره.
وعاقب الاتحاد المجري فريق أوجبيست بعد أن سخرت جماهيره من إبراهيم سيديبي مهاجم فريق ديبرينسين الأسود أثناء مباراة الفريقين أوائل مايو الجاري.
يذكر أن الاتحاد فرض أيضا غرامة مالية قدرها 20 ألف دولار علي أوجبيست في عام 2005 بعد أن رددت جماهيره شعارات معادية للسامية خلال مباراة بالدوري المجري.
وأوضح جيوركساني في خطاب إلي إيستفان كيستليكي رئيس الاتحاد المجري لكرة القدم، أن هذا السلوك العنصري من الجماهير «جلب العار لكرة القدم وللمجر». يذكر أن المجر بها مجموعة من مشجعي كرة القدم المشاغبين «الهوليجانز»، وشارك الكثير منهم في أعمال شغب معادية للحكومة في العام الماضي.""",
}
```
### Data Fields
The data fields are:
- "url": string, original url of the article,
- "head_line": string, headline of the article,
- "date": string, date of the article,
- "text": string, text content of the article,
### Data Splits
There is only one "train" split for all configuration subsets, containing the following number of examples:
| | Number of examples |
|:---------------|-------------------:|
| Alittihad | 349342 |
| Almasryalyoum | 291723 |
| Almustaqbal | 446873 |
| Alqabas | 817274 |
| Echoroukonline | 139732 |
| Ryiadh | 858188 |
| Sabanews | 92149 |
| SaudiYoum | 888068 |
| Techreen | 314597 |
| Youm7 | 1172136 |
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@article{el20161,
title={1.5 billion words arabic corpus},
author={El-Khair, Ibrahim Abu},
journal={arXiv preprint arXiv:1611.04033},
year={2016}
}
```
### Contributions
Thanks to [@zaidalyafeai](https://github.com/zaidalyafeai) and [@albertvillanova](https://github.com/albertvillanova) for adding this dataset. |
BAJIRAO/dataset | ---
license: afl-3.0
---
|