| datasetId | card |
|---|---|
sartajbhuvaji/self-driving-GTA-V | ---
license: mit
task_categories:
- image-classification
tags:
- self driving
- GTA
- GTA V
- driving
size_categories:
- 1M<n<10M
source_datasets:
- original
configs:
- config_name: default
data_files:
- split: mini
path: training_data_count_mini.csv
- split: TrainingData_1
path: training_data_count_001-100.csv
- split: TrainingData_2
path: training_data_count_101-200.csv
---
# Self Driving GTA V Dataset

## Dataset Variants
- Mini : [Link](https://huggingface.co/datasets/sartajbhuvaji/self-driving-GTA-V/tree/main/mini)
- Training Data(1-100) : [Link](https://huggingface.co/datasets/sartajbhuvaji/self-driving-GTA-V/tree/main/Training%20Data(1-100))
- Training Data(101-200) : [Link](https://huggingface.co/datasets/sartajbhuvaji/self-driving-GTA-V/tree/main/Training%20Data(101-200))
### Info
- Image Resolution : 270 × 480 (height × width)
- Mode : RGB
- Dimension : (270, 480, 3)
- Files per Half : 100
- Size : 1.81 GB/file
- Total Data Size : 362 GB
- Total Frames : 1 Million
### Dataset Sizes
#### Mini :
- Folder Name : mini
- Files : 1
- Total Size : 1.81 GB
- Total Frames : 5000
#### First Half
- Folder Name : Training Data(1-100)
- Files : 100
- Total Size : 181 GB
- Total Frames : 500,000
#### Second Half
- Folder Name : Training Data(101-200)
- Files : 100
- Total Size : 181 GB
- Total Frames : 500,000
### Data Count
#### Mini
```
'W': [1, 0, 0, 0, 0, 0, 0, 0, 0] : 3627
'S': [0, 1, 0, 0, 0, 0, 0, 0, 0] : 50
'A': [0, 0, 1, 0, 0, 0, 0, 0, 0] : 104
'D': [0, 0, 0, 1, 0, 0, 0, 0, 0] : 106
'WA': [0, 0, 0, 0, 1, 0, 0, 0, 0] : 364
'WD': [0, 0, 0, 0, 0, 1, 0, 0, 0] : 416
'SA': [0, 0, 0, 0, 0, 0, 1, 0, 0] : 35
'SD': [0, 0, 0, 0, 0, 0, 0, 1, 0] : 47
'NK': [0, 0, 0, 0, 0, 0, 0, 0, 1] : 248
NONE : 3
```
#### First Half (Data Count (1-100))
```
'W': [1, 0, 0, 0, 0, 0, 0, 0, 0] : 353725
'S': [0, 1, 0, 0, 0, 0, 0, 0, 0] : 2243
'A': [0, 0, 1, 0, 0, 0, 0, 0, 0] : 14303
'D': [0, 0, 0, 1, 0, 0, 0, 0, 0] : 13114
'WA': [0, 0, 0, 0, 1, 0, 0, 0, 0] : 30877
'WD': [0, 0, 0, 0, 0, 1, 0, 0, 0] : 29837
'SA': [0, 0, 0, 0, 0, 0, 1, 0, 0] : 1952
'SD': [0, 0, 0, 0, 0, 0, 0, 1, 0] : 1451
'NK': [0, 0, 0, 0, 0, 0, 0, 0, 1] : 52256
NONE : 242
```
#### Second Half (Data Count (101-200))
```
'W': [1, 0, 0, 0, 0, 0, 0, 0, 0] : 359025
'S': [0, 1, 0, 0, 0, 0, 0, 0, 0] : 2834
'A': [0, 0, 1, 0, 0, 0, 0, 0, 0] : 11025
'D': [0, 0, 0, 1, 0, 0, 0, 0, 0] : 9639
'WA': [0, 0, 0, 0, 1, 0, 0, 0, 0] : 31896
'WD': [0, 0, 0, 0, 0, 1, 0, 0, 0] : 29756
'SA': [0, 0, 0, 0, 0, 0, 1, 0, 0] : 1742
'SD': [0, 0, 0, 0, 0, 0, 0, 1, 0] : 2461
'NK': [0, 0, 0, 0, 0, 0, 0, 0, 1] : 51313
NONE : 309
```
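The first-half counts above (which, together with the 242 `NONE` rows, sum to exactly 500,000 frames) show how heavily the data skews toward plain `'W'`. A small sketch that turns those published counts into inverse-frequency class weights for training:

```python
# Class counts for the first half (1-100), copied from the table above.
counts = {
    'W': 353725, 'S': 2243, 'A': 14303, 'D': 13114,
    'WA': 30877, 'WD': 29837, 'SA': 1952, 'SD': 1451, 'NK': 52256,
}

total = sum(counts.values())

# Inverse-frequency weights, normalized so the most common class ('W')
# gets weight 1.0 and rare classes get proportionally larger weights.
weights = {k: counts['W'] / v for k, v in counts.items()}

print(f"'W' makes up {counts['W'] / total:.1%} of labeled frames")
print(f"weight for 'SD': {weights['SD']:.1f}")
```

The same recipe applies to the mini and second-half counts; only the numbers change.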
### Graphics Details
- Original Resolution : 800 x 600
- Aspect Ratio : 16:10
- All Video Settings : Low
### Camera Details
- Camera : Hood Cam
- Vehicle Camera Height : Low
- First Person Vehicle Auto-Center : On
- First Person Head Bobbing : Off
### Other Details
- Vehicle : Michael's Car
- Vehicle Mods : All Max
- Cv2 Mask : None
- Waypoint : Enabled/Following
- Weather Conditions : Mostly Sunny
- Time of Day : Day, Night
- Rain : Some
### Note
- Remove `NONE` while processing the data
- Use the `mini` dataset for initial setup and testing
- Check `training_data_count_001-100.csv` & `training_data_count_101-200.csv` for detailed counts
- Check `training_data_stats.py` for more info
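Per the notes above, `NONE` rows should be dropped before training. A minimal sketch, assuming each file holds `(frame, one_hot)` pairs in the nine-key layout shown in the counts — the on-disk file format is not documented in this card, so the synthetic data below merely stands in for a loaded file:

```python
import numpy as np

# One-hot layout used in the counts above; 'NONE' rows have no valid vector.
KEYS = ['W', 'S', 'A', 'D', 'WA', 'WD', 'SA', 'SD', 'NK']

def drop_none(samples):
    """Keep only (frame, one_hot) pairs whose label is a valid one-hot vector."""
    return [(frame, label) for frame, label in samples
            if np.asarray(label).sum() == 1]

# Synthetic stand-ins for real frames at the dataset's (270, 480, 3) shape.
frame = np.zeros((270, 480, 3), dtype=np.uint8)
data = [
    (frame, np.eye(9, dtype=np.uint8)[KEYS.index('W')]),  # a labeled frame
    (frame, np.zeros(9, dtype=np.uint8)),                  # a NONE row: all-zero label
]
clean = drop_none(data)
print(len(clean))  # 1
```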
### Inspired By
- Sentdex
- [YouTube: Python Plays: Grand Theft Auto V](https://youtube.com/playlist?list=PLQVvvaa0QuDeETZEOy4VdocT7TOjfSA8a&si=M5Pt-O97yvWgZMQE) |
nyuuzyou/wb-products | ---
annotations_creators:
- crowdsourced
language:
- ru
language_creators:
- crowdsourced
license:
- cc0-1.0
multilinguality:
- monolingual
pretty_name: Wildberries products
size_categories:
- 100M<n<1B
source_datasets:
- original
task_categories:
- text-generation
task_ids:
- language-modeling
---
# Dataset Card for Wildberries products
### Dataset Summary
This dataset was scraped from product pages on the Russian marketplace [Wildberries](https://www.wildberries.ru). It includes all information from the product card and metadata from the API, excluding image URLs. The dataset was collected by processing approximately 160 million products out of a potential 230 million, starting from the first product; collection was stopped when severe rate limits prevented further progress. The data is stored as zstd-compressed archives of JSON Lines (jsonl) files, each archive holding the data from a specific Wildberries data server identified by its basket server number.
### Languages
The dataset is mostly in Russian, but there may be other languages present.
## Dataset Structure
### Data Fields
This dataset includes the following fields:
- `imt_id`: Identifier for the item (integer)
- `nm_id`: Numeric identifier associated with the item (integer)
- `imt_name`: Name of the product (string)
- `subj_name`: Subject name (string)
- `subj_root_name`: Root subject name (string)
- `nm_colors_names`: Color names (string, may be empty)
- `vendor_code`: Vendor code (string)
- `description`: Description of the product (string, may be empty)
- `brand_name`: Name of the brand (string)
### Data Splits
All examples are in the train split; there is no validation split.
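The records themselves are plain JSON Lines, so reading them needs only the standard library once an archive is decompressed. A sketch, with the zstd step shown in comments since it needs the third-party `zstandard` package (the exact file layout inside the archives is an assumption):

```python
import json

def iter_products(lines):
    """Yield product dicts from JSON Lines text, skipping blank lines."""
    for line in lines:
        line = line.strip()
        if line:
            yield json.loads(line)

# To stream straight out of a .zst archive, the third-party 'zstandard'
# package can wrap the file object, e.g.:
#   import io, zstandard
#   raw = open(path, 'rb')
#   reader = io.TextIOWrapper(
#       zstandard.ZstdDecompressor().stream_reader(raw), encoding='utf-8')
#   for product in iter_products(reader): ...

# A single synthetic record using the fields documented above:
sample = '{"imt_id": 1, "nm_id": 10, "imt_name": "Sample", "description": ""}\n'
products = list(iter_products([sample]))
print(products[0]["imt_name"])  # Sample
```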
## Additional Information
### License
This dataset is dedicated to the public domain under the Creative Commons Zero (CC0) license. This means you can:
* Use it for any purpose, including commercial projects.
* Modify it however you like.
* Distribute it without asking permission.
No attribution is required, but it's always appreciated!
CC0 license: https://creativecommons.org/publicdomain/zero/1.0/deed.en
To learn more about CC0, visit the Creative Commons website: https://creativecommons.org/publicdomain/zero/1.0/
### Dataset Curators
- [nyuuzyou](https://ducks.party)
|
davide221/verilog-instruct-deepseek-60k | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 118916285
num_examples: 60199
download_size: 37374425
dataset_size: 118916285
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
michelcarroll/llama2-earnings-stock-prediction-fine-tune-v2 | ---
dataset_info:
features:
- name: completion
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 87920323
num_examples: 111140
- name: development
num_bytes: 26603449
num_examples: 33284
- name: test
num_bytes: 840735
num_examples: 1000
download_size: 47167270
dataset_size: 115364507
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: development
path: data/development-*
- split: test
path: data/test-*
---
|
Multimodal-Fatima/VQAv2_minival_validation | ---
dataset_info:
features:
- name: question_type
dtype: string
- name: multiple_choice_answer
dtype: string
- name: answers
sequence: string
- name: answers_original
list:
- name: answer
dtype: string
- name: answer_confidence
dtype: string
- name: answer_id
dtype: int64
- name: id_image
dtype: int64
- name: answer_type
dtype: string
- name: question_id
dtype: int64
- name: question
dtype: string
- name: image
dtype: image
- name: clip_tags_ViT_L_14
sequence: string
- name: blip_caption
dtype: string
- name: LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14
sequence: string
- name: DETA_detections_deta_swin_large_o365_coco_classes
list:
- name: attribute
dtype: string
- name: box
sequence: float32
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float32
- name: size
dtype: string
- name: tag
dtype: string
- name: DETA_detections_deta_swin_large_o365_clip_ViT_L_14
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: DETA_detections_deta_swin_large_o365_clip_ViT_L_14_blip_caption
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: caption
dtype: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: id
dtype: int64
- name: DETA_detections_deta_swin_large_o365_clip_ViT_L_14_blip_caption_caption_module
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: caption
dtype: string
- name: captions_module
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: DETA_detections_deta_swin_large_o365_clip_ViT_L_14_blip_caption_caption_module_without_filtering
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: caption
dtype: string
- name: captions_module
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: DETA_detections_deta_swin_large_o365_clip_ViT_L_14_blip_caption_caption_module_random
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: caption
dtype: string
- name: captions_module
sequence: string
- name: captions_module_filter
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: clip_tags_LAION_ViT_H_14_2B
sequence: string
- name: LLM_Description_gpt3_downstream_tasks_visual_genome_LAION-ViT-H-14-2B
sequence: string
- name: Attributes_ViT_L_14_descriptors_text_davinci_003_full
sequence: string
- name: clip_tags_ViT_L_14_wo_openai
sequence: string
- name: clip_tags_ViT_L_14_with_openai
sequence: string
- name: clip_tags_LAION_ViT_H_14_2B_wo_openai
sequence: string
- name: clip_tags_LAION_ViT_H_14_2B_with_openai
sequence: string
- name: clip_tags_LAION_ViT_bigG_14_2B_wo_openai
sequence: string
- name: clip_tags_LAION_ViT_bigG_14_2B_with_openai
sequence: string
- name: Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full
sequence: string
- name: Attributes_LAION_ViT_bigG_14_2B_descriptors_text_davinci_003_full
sequence: string
- name: clip_tags_ViT_B_16_with_openai
sequence: string
- name: blip_caption_beam_5_Salesforce_blip2_flan_t5_xxl
dtype: string
- name: DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large_
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: captions_all_patches
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large_clean
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: captions_all_patches
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: blip_caption_Salesforce_blip_image_captioning_large_intensive
sequence: string
- name: blip_caption_Salesforce_blip_image_captioning_base_intensive
sequence: string
splits:
- name: validation
num_bytes: 10757838822.0
num_examples: 25994
download_size: 2788131849
dataset_size: 10757838822.0
---
# Dataset Card for "VQAv2_minival_validation_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
codeparrot/github-jupyter-parsed | ---
annotations_creators: []
language_creators:
- crowdsourced
- expert-generated
language:
- code
license:
- other
multilinguality:
- monolingual
size_categories:
- unknown
source_datasets: []
task_categories:
- text-generation
task_ids:
- language-modeling
---
# GitHub Jupyter Dataset
## Dataset Description
This is a parsed and preprocessed version of the [GitHub-Jupyter Dataset](https://huggingface.co/datasets/codeparrot/github-jupyter), a dataset extracted from Jupyter Notebooks on BigQuery. We keep only markdown and Python cells and convert the markdown to text. Heuristics are also applied to filter out notebooks with little data and cells that are very long or very short.
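The cell-filtering step described above might look like the following sketch. Cell-type names follow the Jupyter notebook format; the length thresholds are invented for illustration, since the card does not spell out the actual heuristics:

```python
def filter_cells(cells, min_chars=8, max_chars=4000):
    """Keep only markdown and code cells of reasonable length.

    min_chars/max_chars are illustrative assumptions, not the
    dataset's actual thresholds.
    """
    kept = []
    for cell in cells:
        if cell["cell_type"] not in ("markdown", "code"):
            continue  # drop raw cells and anything else
        if min_chars <= len(cell["source"]) <= max_chars:
            kept.append(cell)
    return kept

notebook = [
    {"cell_type": "markdown", "source": "# Load the data and plot it"},
    {"cell_type": "code", "source": "import pandas as pd"},
    {"cell_type": "raw", "source": "not kept"},
    {"cell_type": "code", "source": "x=1"},  # too short under these thresholds
]
print(len(filter_cells(notebook)))  # 2
```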
## Licenses
Each example has the license of its associated repository. There are 15 licenses in total:
```python
[
'mit',
'apache-2.0',
'gpl-3.0',
'gpl-2.0',
'bsd-3-clause',
'agpl-3.0',
'lgpl-3.0',
'lgpl-2.1',
'bsd-2-clause',
'cc0-1.0',
'epl-1.0',
'mpl-2.0',
'unlicense',
'isc',
'artistic-2.0'
]
```
|
open-llm-leaderboard/details_vitruv__vitruv_1 | ---
pretty_name: Evaluation run of vitruv/vitruv_1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [vitruv/vitruv_1](https://huggingface.co/vitruv/vitruv_1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vitruv__vitruv_1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-06T02:40:06.932046](https://huggingface.co/datasets/open-llm-leaderboard/details_vitruv__vitruv_1/blob/main/results_2024-03-06T02-40-06.932046.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.48098814582864774,\n\
\ \"acc_stderr\": 0.034444629168415404,\n \"acc_norm\": 0.48708235528126465,\n\
\ \"acc_norm_stderr\": 0.03523822694795442,\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.01525011707915648,\n \"mc2\": 0.4122590137037498,\n\
\ \"mc2_stderr\": 0.014277193708018924\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4726962457337884,\n \"acc_stderr\": 0.014589589101985994,\n\
\ \"acc_norm\": 0.4991467576791809,\n \"acc_norm_stderr\": 0.014611369529813276\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5605457080262896,\n\
\ \"acc_stderr\": 0.0049530634047914536,\n \"acc_norm\": 0.7605058753236407,\n\
\ \"acc_norm_stderr\": 0.004259025448541507\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.40789473684210525,\n \"acc_stderr\": 0.039993097127774734,\n\
\ \"acc_norm\": 0.40789473684210525,\n \"acc_norm_stderr\": 0.039993097127774734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.539622641509434,\n \"acc_stderr\": 0.030676096599389177,\n\
\ \"acc_norm\": 0.539622641509434,\n \"acc_norm_stderr\": 0.030676096599389177\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5416666666666666,\n\
\ \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.5416666666666666,\n\
\ \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\"\
: 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.47398843930635837,\n\
\ \"acc_stderr\": 0.03807301726504511,\n \"acc_norm\": 0.47398843930635837,\n\
\ \"acc_norm_stderr\": 0.03807301726504511\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3439153439153439,\n \"acc_stderr\": 0.024464426625596437,\n \"\
acc_norm\": 0.3439153439153439,\n \"acc_norm_stderr\": 0.024464426625596437\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.040735243221471255,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.040735243221471255\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5774193548387097,\n\
\ \"acc_stderr\": 0.02810096472427264,\n \"acc_norm\": 0.5774193548387097,\n\
\ \"acc_norm_stderr\": 0.02810096472427264\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.03344283744280458,\n\
\ \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03344283744280458\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.03793713171165634,\n\
\ \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.03793713171165634\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.033586181457325226,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.033586181457325226\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6632124352331606,\n \"acc_stderr\": 0.03410780251836184,\n\
\ \"acc_norm\": 0.6632124352331606,\n \"acc_norm_stderr\": 0.03410780251836184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.43333333333333335,\n \"acc_stderr\": 0.025124653525885117,\n\
\ \"acc_norm\": 0.43333333333333335,\n \"acc_norm_stderr\": 0.025124653525885117\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02730914058823018,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02730914058823018\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4327731092436975,\n \"acc_stderr\": 0.03218358107742613,\n \
\ \"acc_norm\": 0.4327731092436975,\n \"acc_norm_stderr\": 0.03218358107742613\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6091743119266055,\n \"acc_stderr\": 0.02092005834611106,\n \"\
acc_norm\": 0.6091743119266055,\n \"acc_norm_stderr\": 0.02092005834611106\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3611111111111111,\n \"acc_stderr\": 0.032757734861009996,\n \"\
acc_norm\": 0.3611111111111111,\n \"acc_norm_stderr\": 0.032757734861009996\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6127450980392157,\n \"acc_stderr\": 0.03418931233833344,\n \"\
acc_norm\": 0.6127450980392157,\n \"acc_norm_stderr\": 0.03418931233833344\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6329113924050633,\n \"acc_stderr\": 0.031376240725616185,\n \
\ \"acc_norm\": 0.6329113924050633,\n \"acc_norm_stderr\": 0.031376240725616185\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5874439461883408,\n\
\ \"acc_stderr\": 0.03304062175449297,\n \"acc_norm\": 0.5874439461883408,\n\
\ \"acc_norm_stderr\": 0.03304062175449297\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4732824427480916,\n \"acc_stderr\": 0.04379024936553894,\n\
\ \"acc_norm\": 0.4732824427480916,\n \"acc_norm_stderr\": 0.04379024936553894\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6115702479338843,\n \"acc_stderr\": 0.044492703500683836,\n \"\
acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.044492703500683836\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.50920245398773,\n \"acc_stderr\": 0.03927705600787443,\n\
\ \"acc_norm\": 0.50920245398773,\n \"acc_norm_stderr\": 0.03927705600787443\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.04689765937278135,\n\
\ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.04689765937278135\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7692307692307693,\n\
\ \"acc_stderr\": 0.027601921381417593,\n \"acc_norm\": 0.7692307692307693,\n\
\ \"acc_norm_stderr\": 0.027601921381417593\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6883780332056194,\n\
\ \"acc_stderr\": 0.016562433867284176,\n \"acc_norm\": 0.6883780332056194,\n\
\ \"acc_norm_stderr\": 0.016562433867284176\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.48554913294797686,\n \"acc_stderr\": 0.026907849856282542,\n\
\ \"acc_norm\": 0.48554913294797686,\n \"acc_norm_stderr\": 0.026907849856282542\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25921787709497207,\n\
\ \"acc_stderr\": 0.014655780837497731,\n \"acc_norm\": 0.25921787709497207,\n\
\ \"acc_norm_stderr\": 0.014655780837497731\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5032679738562091,\n \"acc_stderr\": 0.028629305194003543,\n\
\ \"acc_norm\": 0.5032679738562091,\n \"acc_norm_stderr\": 0.028629305194003543\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5819935691318328,\n\
\ \"acc_stderr\": 0.028013651891995072,\n \"acc_norm\": 0.5819935691318328,\n\
\ \"acc_norm_stderr\": 0.028013651891995072\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5432098765432098,\n \"acc_stderr\": 0.027716661650194038,\n\
\ \"acc_norm\": 0.5432098765432098,\n \"acc_norm_stderr\": 0.027716661650194038\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3723404255319149,\n \"acc_stderr\": 0.028838921471251455,\n \
\ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.028838921471251455\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34419817470664926,\n\
\ \"acc_stderr\": 0.01213443374100257,\n \"acc_norm\": 0.34419817470664926,\n\
\ \"acc_norm_stderr\": 0.01213443374100257\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.02952009569768776,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.02952009569768776\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.46078431372549017,\n \"acc_stderr\": 0.02016552331390791,\n \
\ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.02016552331390791\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5545454545454546,\n\
\ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.5545454545454546,\n\
\ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5387755102040817,\n \"acc_stderr\": 0.03191282052669278,\n\
\ \"acc_norm\": 0.5387755102040817,\n \"acc_norm_stderr\": 0.03191282052669278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6069651741293532,\n\
\ \"acc_stderr\": 0.0345368246603156,\n \"acc_norm\": 0.6069651741293532,\n\
\ \"acc_norm_stderr\": 0.0345368246603156\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6549707602339181,\n \"acc_stderr\": 0.036459813773888065,\n\
\ \"acc_norm\": 0.6549707602339181,\n \"acc_norm_stderr\": 0.036459813773888065\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25458996328029376,\n\
\ \"mc1_stderr\": 0.01525011707915648,\n \"mc2\": 0.4122590137037498,\n\
\ \"mc2_stderr\": 0.014277193708018924\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7158642462509865,\n \"acc_stderr\": 0.012675392786772727\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11296436694465505,\n \
\ \"acc_stderr\": 0.008719339028833067\n }\n}\n```"
repo_url: https://huggingface.co/vitruv/vitruv_1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|arc:challenge|25_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|gsm8k|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hellaswag|10_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-06T02-40-06.932046.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-06T02-40-06.932046.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- '**/details_harness|winogrande|5_2024-03-06T02-40-06.932046.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-06T02-40-06.932046.parquet'
- config_name: results
data_files:
- split: 2024_03_06T02_40_06.932046
path:
- results_2024-03-06T02-40-06.932046.parquet
- split: latest
path:
- results_2024-03-06T02-40-06.932046.parquet
---
# Dataset Card for Evaluation run of vitruv/vitruv_1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [vitruv/vitruv_1](https://huggingface.co/vitruv/vitruv_1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vitruv__vitruv_1",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-06T02:40:06.932046](https://huggingface.co/datasets/open-llm-leaderboard/details_vitruv__vitruv_1/blob/main/results_2024-03-06T02-40-06.932046.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.48098814582864774,
"acc_stderr": 0.034444629168415404,
"acc_norm": 0.48708235528126465,
"acc_norm_stderr": 0.03523822694795442,
"mc1": 0.25458996328029376,
"mc1_stderr": 0.01525011707915648,
"mc2": 0.4122590137037498,
"mc2_stderr": 0.014277193708018924
},
"harness|arc:challenge|25": {
"acc": 0.4726962457337884,
"acc_stderr": 0.014589589101985994,
"acc_norm": 0.4991467576791809,
"acc_norm_stderr": 0.014611369529813276
},
"harness|hellaswag|10": {
"acc": 0.5605457080262896,
"acc_stderr": 0.0049530634047914536,
"acc_norm": 0.7605058753236407,
"acc_norm_stderr": 0.004259025448541507
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40789473684210525,
"acc_stderr": 0.039993097127774734,
"acc_norm": 0.40789473684210525,
"acc_norm_stderr": 0.039993097127774734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.539622641509434,
"acc_stderr": 0.030676096599389177,
"acc_norm": 0.539622641509434,
"acc_norm_stderr": 0.030676096599389177
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.47398843930635837,
"acc_stderr": 0.03807301726504511,
"acc_norm": 0.47398843930635837,
"acc_norm_stderr": 0.03807301726504511
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3439153439153439,
"acc_stderr": 0.024464426625596437,
"acc_norm": 0.3439153439153439,
"acc_norm_stderr": 0.024464426625596437
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471255,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471255
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5774193548387097,
"acc_stderr": 0.02810096472427264,
"acc_norm": 0.5774193548387097,
"acc_norm_stderr": 0.02810096472427264
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03344283744280458,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03344283744280458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.03793713171165634,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.03793713171165634
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.033586181457325226,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.033586181457325226
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6632124352331606,
"acc_stderr": 0.03410780251836184,
"acc_norm": 0.6632124352331606,
"acc_norm_stderr": 0.03410780251836184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.43333333333333335,
"acc_stderr": 0.025124653525885117,
"acc_norm": 0.43333333333333335,
"acc_norm_stderr": 0.025124653525885117
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02730914058823018,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02730914058823018
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4327731092436975,
"acc_stderr": 0.03218358107742613,
"acc_norm": 0.4327731092436975,
"acc_norm_stderr": 0.03218358107742613
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6091743119266055,
"acc_stderr": 0.02092005834611106,
"acc_norm": 0.6091743119266055,
"acc_norm_stderr": 0.02092005834611106
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.032757734861009996,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.032757734861009996
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6127450980392157,
"acc_stderr": 0.03418931233833344,
"acc_norm": 0.6127450980392157,
"acc_norm_stderr": 0.03418931233833344
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6329113924050633,
"acc_stderr": 0.031376240725616185,
"acc_norm": 0.6329113924050633,
"acc_norm_stderr": 0.031376240725616185
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5874439461883408,
"acc_stderr": 0.03304062175449297,
"acc_norm": 0.5874439461883408,
"acc_norm_stderr": 0.03304062175449297
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4732824427480916,
"acc_stderr": 0.04379024936553894,
"acc_norm": 0.4732824427480916,
"acc_norm_stderr": 0.04379024936553894
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.044492703500683836,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.044492703500683836
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04766075165356461,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04766075165356461
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.50920245398773,
"acc_stderr": 0.03927705600787443,
"acc_norm": 0.50920245398773,
"acc_norm_stderr": 0.03927705600787443
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.04689765937278135,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.04689765937278135
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7692307692307693,
"acc_stderr": 0.027601921381417593,
"acc_norm": 0.7692307692307693,
"acc_norm_stderr": 0.027601921381417593
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6883780332056194,
"acc_stderr": 0.016562433867284176,
"acc_norm": 0.6883780332056194,
"acc_norm_stderr": 0.016562433867284176
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.48554913294797686,
"acc_stderr": 0.026907849856282542,
"acc_norm": 0.48554913294797686,
"acc_norm_stderr": 0.026907849856282542
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25921787709497207,
"acc_stderr": 0.014655780837497731,
"acc_norm": 0.25921787709497207,
"acc_norm_stderr": 0.014655780837497731
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5032679738562091,
"acc_stderr": 0.028629305194003543,
"acc_norm": 0.5032679738562091,
"acc_norm_stderr": 0.028629305194003543
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5819935691318328,
"acc_stderr": 0.028013651891995072,
"acc_norm": 0.5819935691318328,
"acc_norm_stderr": 0.028013651891995072
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5432098765432098,
"acc_stderr": 0.027716661650194038,
"acc_norm": 0.5432098765432098,
"acc_norm_stderr": 0.027716661650194038
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.028838921471251455,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.028838921471251455
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34419817470664926,
"acc_stderr": 0.01213443374100257,
"acc_norm": 0.34419817470664926,
"acc_norm_stderr": 0.01213443374100257
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.02952009569768776,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.02952009569768776
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.02016552331390791,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.02016552331390791
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5545454545454546,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.5545454545454546,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5387755102040817,
"acc_stderr": 0.03191282052669278,
"acc_norm": 0.5387755102040817,
"acc_norm_stderr": 0.03191282052669278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6069651741293532,
"acc_stderr": 0.0345368246603156,
"acc_norm": 0.6069651741293532,
"acc_norm_stderr": 0.0345368246603156
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6549707602339181,
"acc_stderr": 0.036459813773888065,
"acc_norm": 0.6549707602339181,
"acc_norm_stderr": 0.036459813773888065
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25458996328029376,
"mc1_stderr": 0.01525011707915648,
"mc2": 0.4122590137037498,
"mc2_stderr": 0.014277193708018924
},
"harness|winogrande|5": {
"acc": 0.7158642462509865,
"acc_stderr": 0.012675392786772727
},
"harness|gsm8k|5": {
"acc": 0.11296436694465505,
"acc_stderr": 0.008719339028833067
}
}
```
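Every per-task entry above shares the same four fields (`acc`, `acc_stderr`, `acc_norm`, `acc_norm_stderr`), so aggregate scores can be recomputed offline from the JSON. A minimal sketch over a small hypothetical subset of the values above:

```python
# Macro-average accuracy over MMLU (hendrycksTest) subtasks, mirroring the
# structure of the results JSON above. The dict below is a small illustrative
# subset, with values copied from this card.
results = {
    "harness|hendrycksTest-global_facts|5": {"acc": 0.38},
    "harness|hendrycksTest-high_school_biology|5": {"acc": 0.5774193548387097},
    "harness|hendrycksTest-marketing|5": {"acc": 0.7692307692307693},
}

# Select only the MMLU subtasks and average their accuracies.
mmlu_scores = [
    v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")
]
macro_avg = sum(mmlu_scores) / len(mmlu_scores)
print(f"MMLU macro-average over {len(mmlu_scores)} subtasks: {macro_avg:.4f}")
```

The full results file follows the same shape, so the same loop works after `json.load`-ing it.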
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mtc/full_cleaned_xsum_faith | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: document
dtype: string
- name: claim
dtype: string
- name: bbcid
dtype: string
- name: model_name
dtype: string
- name: label
dtype: string
- name: split
dtype: string
- name: annotations
sequence: string
splits:
- name: test
num_bytes: 3097533.036
num_examples: 1247
- name: train
num_bytes: 2639459.862857143
num_examples: 1048
- name: validation
num_bytes: 451054.0
num_examples: 200
download_size: 2120822
dataset_size: 6188046.898857143
---
# Dataset Card for "full_cleaned_xsum_faith"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
owanr/o1o2o3_xl_r2_iterater_with_human_pref_practice | ---
dataset_info:
features:
- name: src
dtype: string
- name: tgt
dtype: string
splits:
- name: train
num_bytes: 13302090
num_examples: 35644
- name: val
num_bytes: 649176
num_examples: 1692
- name: test
num_bytes: 666158
num_examples: 1707
download_size: 2420178
dataset_size: 14617424
---
# Dataset Card for "o1o2o3_xl_r2_iterater_with_human_pref_practice"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EmileEsmaili/sheet_music_ede2110 | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 2229356112.491
num_examples: 9219
download_size: 1211789844
dataset_size: 2229356112.491
---
# Dataset Card for "sheet_music_ede2110"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TobiasRobotics/brisbane-event-vpr | ---
license: cc-by-nc-sa-4.0
tags:
- computer vision
- robotics
- event cameras
pretty_name: Brisbane Event VPR
arxiv: 2006.02826
---
This dataset accompanies the following publication; please cite it if you use this dataset:
Fischer, T. and Milford, M., 2020. Event-Based Visual Place Recognition With Ensembles of Temporal Windows. IEEE Robotics and Automation Letters, 5(4), pp.6924-6931.
```bibtex
@article{fischer2020event,
title={Event-Based Visual Place Recognition With Ensembles of Temporal Windows},
author={Fischer, Tobias and Milford, Michael},
journal={IEEE Robotics and Automation Letters},
volume={5},
number={4},
pages={6924--6931},
year={2020}
}
```
The dataset contains five sequences of recordings. For each recording, a denoised `parquet` file is made available.
The source files for these `parquet` files can be found on [Zenodo](https://zenodo.org/records/4302805).
We also provide the associated GPS information (`*.nmea` files) recorded using the consumer camera.
Please see the [associated code repository](https://github.com/Tobias-Fischer/sparse-event-vpr) for more information. |
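The `*.nmea` files follow the standard NMEA 0183 format. A minimal stdlib-only sketch for extracting latitude/longitude from `$GPGGA` sentences (the sample sentence below is illustrative, not taken from the dataset files):

```python
# Parse latitude/longitude from a standard NMEA GGA sentence.
# The sample sentence is illustrative only, not from the dataset files.

def parse_gga(sentence: str) -> tuple[float, float]:
    """Return (lat, lon) in decimal degrees from a $GPGGA sentence."""
    fields = sentence.split(",")
    if not fields[0].endswith("GGA"):
        raise ValueError("not a GGA sentence")

    def to_decimal(value: str, hemisphere: str) -> float:
        # NMEA encodes coordinates as ddmm.mmmm (lat) / dddmm.mmmm (lon).
        dot = value.index(".")
        degrees = float(value[: dot - 2])
        minutes = float(value[dot - 2 :])
        decimal = degrees + minutes / 60.0
        # South and West hemispheres are negative in decimal degrees.
        return -decimal if hemisphere in ("S", "W") else decimal

    lat = to_decimal(fields[2], fields[3])
    lon = to_decimal(fields[4], fields[5])
    return lat, lon

sample = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
lat, lon = parse_gga(sample)
print(f"lat={lat:.6f}, lon={lon:.6f}")
```

A dedicated parser such as `pynmea2` would also handle checksums and other sentence types; the sketch above only covers the GGA position fields.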
CyberHarem/cygnet_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of cygnet/シグニット/小天鹅 (Azur Lane)
This is the dataset of cygnet/シグニット/小天鹅 (Azur Lane), containing 283 images and their tags.
The core tags of this character are `long_hair, breasts, large_breasts, white_hair, hair_bun, red_eyes, double_bun, braid, ribbon, bangs, hat, bow, ahoge, purple_eyes, hair_ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 283 | 396.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cygnet_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 283 | 221.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cygnet_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 719 | 492.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cygnet_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 283 | 348.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cygnet_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 719 | 702.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cygnet_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/cygnet_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, blush, cleavage, looking_at_viewer, plaid_bikini, solo, straw_hat, hair_ornament, flower, navel, purple_bikini, innertube, sitting, official_alternate_costume, outdoors, smile |
| 1 | 5 |  |  |  |  |  | 1girl, black_thighhighs, blush, garter_straps, long_sleeves, shirt, solo, bursting_breasts, looking_at_viewer, retrofit_(azur_lane), button_gap, cleavage, open_mouth, purple_skirt, white_background, choker, plaid, simple_background |
| 2 | 8 |  |  |  |  |  | 1girl, hair_ornament, plaid, solo, choker, garter_straps, looking_at_viewer, purple_skirt, short_sleeves, blush, simple_background, black_thighhighs, purple_necktie, white_background, open_mouth, retrofit_(azur_lane), white_shirt |
| 3 | 6 |  |  |  |  |  | 1girl, blush, bursting_breasts, cleavage, collared_shirt, solo, white_shirt, braided_bun, button_gap, light_purple_hair, long_sleeves, purple_skirt, retrofit_(azur_lane), taut_shirt, looking_at_viewer, plaid, simple_background, upper_body, very_long_hair, huge_breasts, twitter_username, undersized_clothes, white_background |
| 4 | 43 |  |  |  |  |  | 1girl, blush, maid_headdress, solo, looking_at_viewer, cleavage, detached_collar, bare_shoulders, blue_dress, white_thighhighs, white_apron, wrist_cuffs, garter_straps, frilled_apron, waist_apron, hair_bow, blue_bow, collarbone, frilled_dress, very_long_hair, alternate_costume, braided_bun, white_collar, holding, strapless |
| 5 | 17 |  |  |  |  |  | 1girl, blush, christmas, garter_straps, looking_at_viewer, red_gloves, santa_costume, solo, white_thighhighs, capelet, skirt, bell, fur_trim, hair_ornament, navel, cleavage, midriff, light_purple_hair, gift_box |
| 6 | 48 |  |  |  |  |  | 1girl, cheerleader, navel, elbow_gloves, white_gloves, blush, solo, looking_at_viewer, pom_pom_(cheerleading), purple_skirt, midriff, open_mouth, bare_shoulders, white_belt, white_thighhighs, miniskirt, sleeveless_shirt, sweat, yellow_ribbon, pleated_skirt, white_shirt, holding, whistle_around_neck, collared_shirt, black_choker, zettai_ryouiki, crop_top |
| 7 | 9 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blush, choker, holding, flower, hairclip, pink_kimono, smile, obi, umbrella, wide_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | cleavage | looking_at_viewer | plaid_bikini | solo | straw_hat | hair_ornament | flower | navel | purple_bikini | innertube | sitting | official_alternate_costume | outdoors | smile | black_thighhighs | garter_straps | long_sleeves | shirt | bursting_breasts | retrofit_(azur_lane) | button_gap | open_mouth | purple_skirt | white_background | choker | plaid | simple_background | short_sleeves | purple_necktie | white_shirt | collared_shirt | braided_bun | light_purple_hair | taut_shirt | upper_body | very_long_hair | huge_breasts | twitter_username | undersized_clothes | maid_headdress | detached_collar | bare_shoulders | blue_dress | white_thighhighs | white_apron | wrist_cuffs | frilled_apron | waist_apron | hair_bow | blue_bow | collarbone | frilled_dress | alternate_costume | white_collar | holding | strapless | christmas | red_gloves | santa_costume | capelet | skirt | bell | fur_trim | midriff | gift_box | cheerleader | elbow_gloves | white_gloves | pom_pom_(cheerleading) | white_belt | miniskirt | sleeveless_shirt | sweat | yellow_ribbon | pleated_skirt | whistle_around_neck | black_choker | zettai_ryouiki | crop_top | hairclip | pink_kimono | obi | umbrella | wide_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-----------|:--------------------|:---------------|:-------|:------------|:----------------|:---------|:--------|:----------------|:------------|:----------|:-----------------------------|:-----------|:--------|:-------------------|:----------------|:---------------|:--------|:-------------------|:-----------------------|:-------------|:-------------|:---------------|:-------------------|:---------|:--------|:--------------------|:----------------|:-----------------|:--------------|:-----------------|:--------------|:--------------------|:-------------|:-------------|:-----------------|:---------------|:-------------------|:---------------------|:-----------------|:------------------|:-----------------|:-------------|:-------------------|:--------------|:--------------|:----------------|:--------------|:-----------|:-----------|:-------------|:----------------|:--------------------|:---------------|:----------|:------------|:------------|:-------------|:----------------|:----------|:--------|:-------|:-----------|:----------|:-----------|:--------------|:---------------|:---------------|:-------------------------|:-------------|:------------|:-------------------|:--------|:----------------|:----------------|:----------------------|:---------------|:-----------------|:-----------|:-----------|:--------------|:------|:-----------|:---------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | | X | | X | | X | | | | | | | | | X | X | | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | | | | X | | X | X | X | | X | X | | X | X | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 43 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 17 |  |  |  |  |  | X | X | X | X | | X | | X | | X | | | | | | | | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 6 | 48 |  |  |  |  |  | X | X | | X | | X | | | | X | | | | | | | | | | | | | | X | X | | | | | | | X | X | | | | | | | | | | | X | | X | | | | | | | | | | | X | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | |
| 7 | 9 |  |  |  |  |  | X | X | | X | | X | | | X | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X |
|
bjoernp/gaps_spa | ---
dataset_info:
features:
- name: sentences
dtype: string
- name: sentences_sp
dtype: string
splits:
- name: train
num_bytes: 59056357510
num_examples: 231500660
download_size: 34172826813
dataset_size: 59056357510
---
# Dataset Card for "gaps_spa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FUBUKIBG/rosepronto1 | ---
license: openrail
---
|
open-llm-leaderboard/details_GritLM__GritLM-8x7B | ---
pretty_name: Evaluation run of GritLM/GritLM-8x7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [GritLM/GritLM-8x7B](https://huggingface.co/GritLM/GritLM-8x7B) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_GritLM__GritLM-8x7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T14:24:41.959625](https://huggingface.co/datasets/open-llm-leaderboard/details_GritLM__GritLM-8x7B/blob/main/results_2024-03-10T14-24-41.959625.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7125539724588248,\n\
\ \"acc_stderr\": 0.0303257615066163,\n \"acc_norm\": 0.7161704881482764,\n\
\ \"acc_norm_stderr\": 0.030919401183717173,\n \"mc1\": 0.33659730722154224,\n\
\ \"mc1_stderr\": 0.016542412809494884,\n \"mc2\": 0.4947094775859723,\n\
\ \"mc2_stderr\": 0.014373065476642853\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6450511945392492,\n \"acc_stderr\": 0.013983036904094085,\n\
\ \"acc_norm\": 0.6774744027303754,\n \"acc_norm_stderr\": 0.01365998089427737\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6650069707229636,\n\
\ \"acc_stderr\": 0.004710234188047369,\n \"acc_norm\": 0.865166301533559,\n\
\ \"acc_norm_stderr\": 0.003408478333768264\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8223684210526315,\n \"acc_stderr\": 0.031103182383123384,\n\
\ \"acc_norm\": 0.8223684210526315,\n \"acc_norm_stderr\": 0.031103182383123384\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7962264150943397,\n \"acc_stderr\": 0.024790784501775402,\n\
\ \"acc_norm\": 0.7962264150943397,\n \"acc_norm_stderr\": 0.024790784501775402\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.03216600808802268,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.03216600808802268\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6994219653179191,\n\
\ \"acc_stderr\": 0.0349610148119118,\n \"acc_norm\": 0.6994219653179191,\n\
\ \"acc_norm_stderr\": 0.0349610148119118\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6851063829787234,\n \"acc_stderr\": 0.030363582197238174,\n\
\ \"acc_norm\": 0.6851063829787234,\n \"acc_norm_stderr\": 0.030363582197238174\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6228070175438597,\n\
\ \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.6228070175438597,\n\
\ \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.03996629574876719,\n\
\ \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.03996629574876719\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4947089947089947,\n \"acc_stderr\": 0.02574986828855657,\n \"\
acc_norm\": 0.4947089947089947,\n \"acc_norm_stderr\": 0.02574986828855657\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.6031746031746031,\n\
\ \"acc_stderr\": 0.0437588849272706,\n \"acc_norm\": 0.6031746031746031,\n\
\ \"acc_norm_stderr\": 0.0437588849272706\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8387096774193549,\n\
\ \"acc_stderr\": 0.020923327006423298,\n \"acc_norm\": 0.8387096774193549,\n\
\ \"acc_norm_stderr\": 0.020923327006423298\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6059113300492611,\n \"acc_stderr\": 0.034381579670365446,\n\
\ \"acc_norm\": 0.6059113300492611,\n \"acc_norm_stderr\": 0.034381579670365446\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483016,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483016\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8484848484848485,\n \"acc_stderr\": 0.02554565042660362,\n \"\
acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.02554565042660362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.01673108529360755,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.01673108529360755\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7076923076923077,\n \"acc_stderr\": 0.023060438380857737,\n\
\ \"acc_norm\": 0.7076923076923077,\n \"acc_norm_stderr\": 0.023060438380857737\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3851851851851852,\n \"acc_stderr\": 0.029670906124630886,\n \
\ \"acc_norm\": 0.3851851851851852,\n \"acc_norm_stderr\": 0.029670906124630886\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8109243697478992,\n \"acc_stderr\": 0.02543511943810535,\n \
\ \"acc_norm\": 0.8109243697478992,\n \"acc_norm_stderr\": 0.02543511943810535\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4503311258278146,\n \"acc_stderr\": 0.04062290018683775,\n \"\
acc_norm\": 0.4503311258278146,\n \"acc_norm_stderr\": 0.04062290018683775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8844036697247707,\n \"acc_stderr\": 0.01370874953417264,\n \"\
acc_norm\": 0.8844036697247707,\n \"acc_norm_stderr\": 0.01370874953417264\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6435185185185185,\n \"acc_stderr\": 0.032664783315272714,\n \"\
acc_norm\": 0.6435185185185185,\n \"acc_norm_stderr\": 0.032664783315272714\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8676470588235294,\n \"acc_stderr\": 0.02378429752091885,\n \"\
acc_norm\": 0.8676470588235294,\n \"acc_norm_stderr\": 0.02378429752091885\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8523206751054853,\n \"acc_stderr\": 0.0230943295825957,\n \
\ \"acc_norm\": 0.8523206751054853,\n \"acc_norm_stderr\": 0.0230943295825957\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7802690582959642,\n\
\ \"acc_stderr\": 0.027790177064383595,\n \"acc_norm\": 0.7802690582959642,\n\
\ \"acc_norm_stderr\": 0.027790177064383595\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.03336820338476074,\n\
\ \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.03336820338476074\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8512396694214877,\n \"acc_stderr\": 0.03248470083807193,\n \"\
acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.03248470083807193\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n\
\ \"acc_stderr\": 0.03434300243631,\n \"acc_norm\": 0.8518518518518519,\n\
\ \"acc_norm_stderr\": 0.03434300243631\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5982142857142857,\n\
\ \"acc_stderr\": 0.04653333146973647,\n \"acc_norm\": 0.5982142857142857,\n\
\ \"acc_norm_stderr\": 0.04653333146973647\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.03393295729761012,\n\
\ \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.03393295729761012\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9273504273504274,\n\
\ \"acc_stderr\": 0.017004368568132366,\n \"acc_norm\": 0.9273504273504274,\n\
\ \"acc_norm_stderr\": 0.017004368568132366\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8697318007662835,\n\
\ \"acc_stderr\": 0.012036729568216054,\n \"acc_norm\": 0.8697318007662835,\n\
\ \"acc_norm_stderr\": 0.012036729568216054\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7890173410404624,\n \"acc_stderr\": 0.021966309947043114,\n\
\ \"acc_norm\": 0.7890173410404624,\n \"acc_norm_stderr\": 0.021966309947043114\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4782122905027933,\n\
\ \"acc_stderr\": 0.016706617522176132,\n \"acc_norm\": 0.4782122905027933,\n\
\ \"acc_norm_stderr\": 0.016706617522176132\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7810457516339869,\n \"acc_stderr\": 0.02367908986180772,\n\
\ \"acc_norm\": 0.7810457516339869,\n \"acc_norm_stderr\": 0.02367908986180772\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8006430868167203,\n\
\ \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.8006430868167203,\n\
\ \"acc_norm_stderr\": 0.022691033780549656\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8302469135802469,\n \"acc_stderr\": 0.020888690414093868,\n\
\ \"acc_norm\": 0.8302469135802469,\n \"acc_norm_stderr\": 0.020888690414093868\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5460992907801419,\n \"acc_stderr\": 0.029700453247291467,\n \
\ \"acc_norm\": 0.5460992907801419,\n \"acc_norm_stderr\": 0.029700453247291467\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5234680573663625,\n\
\ \"acc_stderr\": 0.012756161942523339,\n \"acc_norm\": 0.5234680573663625,\n\
\ \"acc_norm_stderr\": 0.012756161942523339\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7977941176470589,\n \"acc_stderr\": 0.024398192986654924,\n\
\ \"acc_norm\": 0.7977941176470589,\n \"acc_norm_stderr\": 0.024398192986654924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7663398692810458,\n \"acc_stderr\": 0.017119158496044506,\n \
\ \"acc_norm\": 0.7663398692810458,\n \"acc_norm_stderr\": 0.017119158496044506\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7877551020408163,\n \"acc_stderr\": 0.026176967197866764,\n\
\ \"acc_norm\": 0.7877551020408163,\n \"acc_norm_stderr\": 0.026176967197866764\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n\
\ \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n\
\ \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352202,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352202\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366152,\n\
\ \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366152\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33659730722154224,\n\
\ \"mc1_stderr\": 0.016542412809494884,\n \"mc2\": 0.4947094775859723,\n\
\ \"mc2_stderr\": 0.014373065476642853\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8279400157853196,\n \"acc_stderr\": 0.010607731615247005\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6163760424564063,\n \
\ \"acc_stderr\": 0.013394238584938161\n }\n}\n```"
repo_url: https://huggingface.co/GritLM/GritLM-8x7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|arc:challenge|25_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|arc:challenge|25_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|gsm8k|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|gsm8k|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hellaswag|10_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hellaswag|10_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T03-21-40.484316.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T14-24-41.959625.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T14-24-41.959625.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- '**/details_harness|winogrande|5_2024-02-21T03-21-40.484316.parquet'
- split: 2024_03_10T14_24_41.959625
path:
- '**/details_harness|winogrande|5_2024-03-10T14-24-41.959625.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T14-24-41.959625.parquet'
- config_name: results
data_files:
- split: 2024_02_21T03_21_40.484316
path:
- results_2024-02-21T03-21-40.484316.parquet
- split: 2024_03_10T14_24_41.959625
path:
- results_2024-03-10T14-24-41.959625.parquet
- split: latest
path:
- results_2024-03-10T14-24-41.959625.parquet
---
# Dataset Card for Evaluation run of GritLM/GritLM-8x7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [GritLM/GritLM-8x7B](https://huggingface.co/GritLM/GritLM-8x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details of a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_GritLM__GritLM-8x7B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-10T14:24:41.959625](https://huggingface.co/datasets/open-llm-leaderboard/details_GritLM__GritLM-8x7B/blob/main/results_2024-03-10T14-24-41.959625.json). Note that there might be results for other tasks in the repository if successive evaluations didn't cover the same tasks; each one can be found in its own configuration, under the "latest" split:
```json
{
"all": {
"acc": 0.7125539724588248,
"acc_stderr": 0.0303257615066163,
"acc_norm": 0.7161704881482764,
"acc_norm_stderr": 0.030919401183717173,
"mc1": 0.33659730722154224,
"mc1_stderr": 0.016542412809494884,
"mc2": 0.4947094775859723,
"mc2_stderr": 0.014373065476642853
},
"harness|arc:challenge|25": {
"acc": 0.6450511945392492,
"acc_stderr": 0.013983036904094085,
"acc_norm": 0.6774744027303754,
"acc_norm_stderr": 0.01365998089427737
},
"harness|hellaswag|10": {
"acc": 0.6650069707229636,
"acc_stderr": 0.004710234188047369,
"acc_norm": 0.865166301533559,
"acc_norm_stderr": 0.003408478333768264
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8223684210526315,
"acc_stderr": 0.031103182383123384,
"acc_norm": 0.8223684210526315,
"acc_norm_stderr": 0.031103182383123384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7962264150943397,
"acc_stderr": 0.024790784501775402,
"acc_norm": 0.7962264150943397,
"acc_norm_stderr": 0.024790784501775402
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.03216600808802268,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.03216600808802268
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.0349610148119118,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.0349610148119118
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6851063829787234,
"acc_stderr": 0.030363582197238174,
"acc_norm": 0.6851063829787234,
"acc_norm_stderr": 0.030363582197238174
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6228070175438597,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.6228070175438597,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6413793103448275,
"acc_stderr": 0.03996629574876719,
"acc_norm": 0.6413793103448275,
"acc_norm_stderr": 0.03996629574876719
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4947089947089947,
"acc_stderr": 0.02574986828855657,
"acc_norm": 0.4947089947089947,
"acc_norm_stderr": 0.02574986828855657
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.6031746031746031,
"acc_stderr": 0.0437588849272706,
"acc_norm": 0.6031746031746031,
"acc_norm_stderr": 0.0437588849272706
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8387096774193549,
"acc_stderr": 0.020923327006423298,
"acc_norm": 0.8387096774193549,
"acc_norm_stderr": 0.020923327006423298
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6059113300492611,
"acc_stderr": 0.034381579670365446,
"acc_norm": 0.6059113300492611,
"acc_norm_stderr": 0.034381579670365446
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483016,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483016
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.02554565042660362,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.02554565042660362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.01673108529360755,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.01673108529360755
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7076923076923077,
"acc_stderr": 0.023060438380857737,
"acc_norm": 0.7076923076923077,
"acc_norm_stderr": 0.023060438380857737
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.029670906124630886,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.029670906124630886
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8109243697478992,
"acc_stderr": 0.02543511943810535,
"acc_norm": 0.8109243697478992,
"acc_norm_stderr": 0.02543511943810535
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4503311258278146,
"acc_stderr": 0.04062290018683775,
"acc_norm": 0.4503311258278146,
"acc_norm_stderr": 0.04062290018683775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8844036697247707,
"acc_stderr": 0.01370874953417264,
"acc_norm": 0.8844036697247707,
"acc_norm_stderr": 0.01370874953417264
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6435185185185185,
"acc_stderr": 0.032664783315272714,
"acc_norm": 0.6435185185185185,
"acc_norm_stderr": 0.032664783315272714
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8676470588235294,
"acc_stderr": 0.02378429752091885,
"acc_norm": 0.8676470588235294,
"acc_norm_stderr": 0.02378429752091885
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8523206751054853,
"acc_stderr": 0.0230943295825957,
"acc_norm": 0.8523206751054853,
"acc_norm_stderr": 0.0230943295825957
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7802690582959642,
"acc_stderr": 0.027790177064383595,
"acc_norm": 0.7802690582959642,
"acc_norm_stderr": 0.027790177064383595
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8244274809160306,
"acc_stderr": 0.03336820338476074,
"acc_norm": 0.8244274809160306,
"acc_norm_stderr": 0.03336820338476074
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.03248470083807193,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.03248470083807193
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243631,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243631
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5982142857142857,
"acc_stderr": 0.04653333146973647,
"acc_norm": 0.5982142857142857,
"acc_norm_stderr": 0.04653333146973647
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.03393295729761012,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.03393295729761012
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9273504273504274,
"acc_stderr": 0.017004368568132366,
"acc_norm": 0.9273504273504274,
"acc_norm_stderr": 0.017004368568132366
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8697318007662835,
"acc_stderr": 0.012036729568216054,
"acc_norm": 0.8697318007662835,
"acc_norm_stderr": 0.012036729568216054
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7890173410404624,
"acc_stderr": 0.021966309947043114,
"acc_norm": 0.7890173410404624,
"acc_norm_stderr": 0.021966309947043114
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4782122905027933,
"acc_stderr": 0.016706617522176132,
"acc_norm": 0.4782122905027933,
"acc_norm_stderr": 0.016706617522176132
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7810457516339869,
"acc_stderr": 0.02367908986180772,
"acc_norm": 0.7810457516339869,
"acc_norm_stderr": 0.02367908986180772
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8006430868167203,
"acc_stderr": 0.022691033780549656,
"acc_norm": 0.8006430868167203,
"acc_norm_stderr": 0.022691033780549656
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8302469135802469,
"acc_stderr": 0.020888690414093868,
"acc_norm": 0.8302469135802469,
"acc_norm_stderr": 0.020888690414093868
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5460992907801419,
"acc_stderr": 0.029700453247291467,
"acc_norm": 0.5460992907801419,
"acc_norm_stderr": 0.029700453247291467
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5234680573663625,
"acc_stderr": 0.012756161942523339,
"acc_norm": 0.5234680573663625,
"acc_norm_stderr": 0.012756161942523339
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7977941176470589,
"acc_stderr": 0.024398192986654924,
"acc_norm": 0.7977941176470589,
"acc_norm_stderr": 0.024398192986654924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7663398692810458,
"acc_stderr": 0.017119158496044506,
"acc_norm": 0.7663398692810458,
"acc_norm_stderr": 0.017119158496044506
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7877551020408163,
"acc_stderr": 0.026176967197866764,
"acc_norm": 0.7877551020408163,
"acc_norm_stderr": 0.026176967197866764
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352202,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352202
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366152,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366152
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33659730722154224,
"mc1_stderr": 0.016542412809494884,
"mc2": 0.4947094775859723,
"mc2_stderr": 0.014373065476642853
},
"harness|winogrande|5": {
"acc": 0.8279400157853196,
"acc_stderr": 0.010607731615247005
},
"harness|gsm8k|5": {
"acc": 0.6163760424564063,
"acc_stderr": 0.013394238584938161
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ACT8113/DylanKlebold | ---
license: openrail
---
|
polejowska/MIST1-brain-gt-tumors | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
list:
- name: category_id
dtype:
class_label:
names:
'0': mist1
- name: image_id
dtype: string
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: segmentation
list:
list: float32
- name: iscrowd
dtype: bool
splits:
- name: train
num_bytes: 905343541.0
num_examples: 460
- name: valid
num_bytes: 75913475.0
num_examples: 40
- name: test
num_bytes: 48033661.0
num_examples: 25
download_size: 1029189310
dataset_size: 1029290677.0
---
# Dataset Card for "MIST1-brain-gt-tumors"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rahulvyasm/medical_insurance_data | ---
license: mit
---
# Dataset Card for Medical Insurance Cost Prediction
The medical insurance dataset encompasses various factors influencing medical expenses, such as age, sex, BMI, smoking status, number of children, and region. This dataset serves as a foundation for training machine learning models capable of forecasting medical expenses for new policyholders.
Its purpose is to shed light on the pivotal elements contributing to increased insurance costs, aiding the company in making more informed decisions concerning pricing and risk assessment.
## Dataset Description
The dataset contains **2.7K rows** and **7 columns**.
**Columns include:**
1. Age
2. Sex
3. BMI (Body Mass Index)
4. Children
5. Smoker
6. Region
7. Charges
#### Table of Contents
- [Introduction](#introduction)
- [Problem Statement](#problem-statement)
- [Features](#features)
- [Technologies Used](#technologies-used)
- [Usage](#usage)
- [Installation](#installation)
- [Data Preparation](#data-preparation)
- [Model Training](#model-training)
- [Model Evaluation](#model-evaluation)
- [Model Serialization](#model-serialization)
- [Contributors](#contributors)
- [License](#license)
#### Introduction
Healthcare costs are a significant concern for individuals and families worldwide. Predicting medical insurance costs accurately can help insurance companies determine premiums and assist individuals in planning their healthcare expenses. This project focuses on building machine learning models to predict insurance costs based on demographic and health-related attributes.
#### Problem Statement
1. What are the most important factors that affect medical expenses?
2. How well can machine learning models predict medical expenses?
3. How can machine learning models be used to improve the efficiency and profitability of health insurance companies?
#### Features
- **Data Exploration**: Explore the dataset to understand its structure, identify missing values, and analyze the distribution of features.
- **Data Preprocessing**: Prepare the data by handling categorical variables, renaming columns, and scaling numerical features.
- **Model Training**: Utilize linear regression and ridge regression models to train predictive models on the prepared dataset.
- **Pipeline Construction**: Construct a data preprocessing pipeline to streamline the process of transforming input data for model training.
- **Model Evaluation**: Evaluate model performance using metrics such as R-squared score and mean squared error to assess predictive accuracy.
- **Model Serialization**: Save trained models and pipelines to disk using the pickle library for future use.
#### Technologies Used
- **Python**: Programming language used for data manipulation, analysis, and model implementation.
- **Libraries**: NumPy, Pandas, Seaborn, Matplotlib, and Scikit-learn for data handling, visualization, and machine learning tasks.
- **Machine Learning Models**: Linear Regression, Ridge Regression
- **Pickle**: Python library used for serializing trained models and pipelines to disk.
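The workflow described above (preprocessing pipeline, ridge regression, pickle serialization) can be sketched as follows. This is a minimal illustration, not the card authors' actual code: the column names (`age`, `sex`, `bmi`, `children`, `smoker`, `region`, `charges`) follow the card, but the data here is synthetic stand-in data, and the model settings (e.g. `alpha=1.0`) are assumptions.

```python
import pickle
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Synthetic stand-in for the real CSV; column names follow the card.
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "age": rng.integers(18, 65, n),
    "sex": rng.choice(["male", "female"], n),
    "bmi": rng.normal(30, 6, n),
    "children": rng.integers(0, 5, n),
    "smoker": rng.choice(["yes", "no"], n),
    "region": rng.choice(["northeast", "northwest", "southeast", "southwest"], n),
})
# Illustrative target: charges driven mainly by age, BMI, and smoking status.
df["charges"] = (
    250 * df["age"] + 300 * df["bmi"]
    + 20000 * (df["smoker"] == "yes") + rng.normal(0, 2000, n)
)

X, y = df.drop(columns="charges"), df["charges"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Preprocessing pipeline: scale numeric features, one-hot encode categoricals.
numeric = ["age", "bmi", "children"]
categorical = ["sex", "smoker", "region"]
model = Pipeline([
    ("prep", ColumnTransformer([
        ("num", StandardScaler(), numeric),
        ("cat", OneHotEncoder(handle_unknown="ignore"), categorical),
    ])),
    ("ridge", Ridge(alpha=1.0)),  # assumed regularization strength
])
model.fit(X_train, y_train)
print(f"R^2 on held-out data: {r2_score(y_test, model.predict(X_test)):.3f}")

# Serialize the fitted pipeline for later reuse.
with open("insurance_model.pkl", "wb") as f:
    pickle.dump(model, f)
```

Bundling the preprocessing and the regressor in a single `Pipeline` means the pickled artifact can score raw DataFrame rows directly, without re-applying the transformations by hand.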
### Dataset Sources
Compiled from multiple online and offline datasets.
open-llm-leaderboard/details_TFLai__Athena-Platypus2-13B-QLora-0.80-epoch | ---
pretty_name: Evaluation run of TFLai/Athena-Platypus2-13B-QLora-0.80-epoch
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TFLai/Athena-Platypus2-13B-QLora-0.80-epoch](https://huggingface.co/TFLai/Athena-Platypus2-13B-QLora-0.80-epoch)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__Athena-Platypus2-13B-QLora-0.80-epoch\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-21T23:00:07.727248](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__Athena-Platypus2-13B-QLora-0.80-epoch/blob/main/results_2023-10-21T23-00-07.727248.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.10455117449664429,\n\
\ \"em_stderr\": 0.0031334624512179676,\n \"f1\": 0.22509018456375873,\n\
\ \"f1_stderr\": 0.0034177949703821024,\n \"acc\": 0.3634414270694895,\n\
\ \"acc_stderr\": 0.006645721423171415\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.10455117449664429,\n \"em_stderr\": 0.0031334624512179676,\n\
\ \"f1\": 0.22509018456375873,\n \"f1_stderr\": 0.0034177949703821024\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \
\ \"acc_stderr\": 0.0007581501137225331\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7261247040252565,\n \"acc_stderr\": 0.012533292732620297\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TFLai/Athena-Platypus2-13B-QLora-0.80-epoch
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|arc:challenge|25_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_21T23_00_07.727248
path:
- '**/details_harness|drop|3_2023-10-21T23-00-07.727248.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-21T23-00-07.727248.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_21T23_00_07.727248
path:
- '**/details_harness|gsm8k|5_2023-10-21T23-00-07.727248.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-21T23-00-07.727248.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hellaswag|10_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T12:24:23.685858.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T12:24:23.685858.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T12:24:23.685858.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_21T23_00_07.727248
path:
- '**/details_harness|winogrande|5_2023-10-21T23-00-07.727248.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-21T23-00-07.727248.parquet'
- config_name: results
data_files:
- split: 2023_08_30T12_24_23.685858
path:
- results_2023-08-30T12:24:23.685858.parquet
- split: 2023_10_21T23_00_07.727248
path:
- results_2023-10-21T23-00-07.727248.parquet
- split: latest
path:
- results_2023-10-21T23-00-07.727248.parquet
---
# Dataset Card for Evaluation run of TFLai/Athena-Platypus2-13B-QLora-0.80-epoch
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/Athena-Platypus2-13B-QLora-0.80-epoch
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/Athena-Platypus2-13B-QLora-0.80-epoch](https://huggingface.co/TFLai/Athena-Platypus2-13B-QLora-0.80-epoch) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__Athena-Platypus2-13B-QLora-0.80-epoch",
"harness_winogrande_5",
    split="latest")
```
## Latest results
These are the [latest results from run 2023-10-21T23:00:07.727248](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__Athena-Platypus2-13B-QLora-0.80-epoch/blob/main/results_2023-10-21T23-00-07.727248.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.10455117449664429,
"em_stderr": 0.0031334624512179676,
"f1": 0.22509018456375873,
"f1_stderr": 0.0034177949703821024,
"acc": 0.3634414270694895,
"acc_stderr": 0.006645721423171415
},
"harness|drop|3": {
"em": 0.10455117449664429,
"em_stderr": 0.0031334624512179676,
"f1": 0.22509018456375873,
"f1_stderr": 0.0034177949703821024
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225331
},
"harness|winogrande|5": {
"acc": 0.7261247040252565,
"acc_stderr": 0.012533292732620297
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
MatsuoDochiai/Sumpomni | ---
license: openrail
---
|
VaibhavGp69/Aarogya_MedText | ---
dataset_info:
features:
- name: Aarogya_prompt
dtype: string
- name: Prompt
dtype: string
- name: Completion
dtype: string
splits:
- name: train
num_bytes: 1305200
num_examples: 1412
download_size: 658986
dataset_size: 1305200
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_qqp_analytic_whose_relativizer | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 3705
num_examples: 17
- name: test
num_bytes: 39408
num_examples: 188
- name: train
num_bytes: 39665
num_examples: 183
download_size: 57861
dataset_size: 82778
---
# Dataset Card for "MULTI_VALUE_qqp_analytic_whose_relativizer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nan-Do/instructional_code-search-net-php | ---
dataset_info:
features:
- name: INSTRUCTION
dtype: string
- name: RESPONSE
dtype: string
- name: SOURCE
dtype: string
splits:
- name: train
num_bytes: 448756286
num_examples: 536632
download_size: 158708948
dataset_size: 448756286
license: apache-2.0
task_categories:
- conversational
- text-generation
- text2text-generation
language:
- en
tags:
- PHP
- Code Generation
- Instruction Response
pretty_name: Instructional PHP Dataset
---
# Dataset Card for "instructional_code-search-net-php"
## Dataset Description
- **Homepage:** None
- **Repository:** https://huggingface.co/datasets/Nan-Do/instructional_code-search-net-php
- **Paper:** None
- **Leaderboard:** None
- **Point of Contact:** [@Nan-Do](https://github.com/Nan-Do)
### Dataset Summary
This is an instructional dataset for PHP.
The dataset contains two different kinds of tasks:
- Given a piece of code generate a description of what it does.
- Given a description generate a piece of code that fulfils the description.
### Languages
The dataset is in English.
### Data Splits
There are no splits.
## Dataset Creation
The dataset was created in May 2023.
### Curation Rationale
This dataset was created to improve the coding capabilities of LLMs.
### Source Data
The summarized version of the code-search-net dataset can be found at https://huggingface.co/datasets/Nan-Do/code-search-net-php
### Annotations
The dataset includes an instruction and response columns.
#### Annotation process
The annotation procedure was done using templates and NLP techniques to generate human-like instructions and responses.
A sample notebook of the process can be found at https://github.com/Nan-Do/OpenAssistantInstructionResponsePython
The annotations have been cleaned to make sure there are no repetitions and/or meaningless summaries.
### Licensing Information
Apache 2.0
|
huggingface-ml-4-games-course/unity-demos | ---
license: apache-2.0
---
# Unity Demos 🎮
This dataset contains the Unity demos for the **[ML for Games course](https://huggingface.co/learn/ml-games-course/unit0/introduction)**
The course's link 👉 https://huggingface.co/learn/ml-games-course/unit0/introduction
<img src="https://huggingface.co/datasets/huggingface-ml-4-games-course/course-images/resolve/main/en/unit0/thumbnail.jpg" alt="ML for Games course"/>
|
open-llm-leaderboard/details_Felladrin__Minueza-32Mx2-Chat | ---
pretty_name: Evaluation run of Felladrin/Minueza-32Mx2-Chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Felladrin/Minueza-32Mx2-Chat](https://huggingface.co/Felladrin/Minueza-32Mx2-Chat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Felladrin__Minueza-32Mx2-Chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T19:34:20.089689](https://huggingface.co/datasets/open-llm-leaderboard/details_Felladrin__Minueza-32Mx2-Chat/blob/main/results_2024-03-09T19-34-20.089689.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25904930626678757,\n\
\ \"acc_stderr\": 0.03081108979139557,\n \"acc_norm\": 0.25975067177423705,\n\
\ \"acc_norm_stderr\": 0.03163420298964525,\n \"mc1\": 0.2594859241126071,\n\
\ \"mc1_stderr\": 0.015345409485557994,\n \"mc2\": 0.4455926367351534,\n\
\ \"mc2_stderr\": 0.015305936450342793\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.16040955631399317,\n \"acc_stderr\": 0.010724336059110964,\n\
\ \"acc_norm\": 0.20136518771331058,\n \"acc_norm_stderr\": 0.01171892747744427\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2650866361282613,\n\
\ \"acc_stderr\": 0.004404772735765963,\n \"acc_norm\": 0.2635929097789285,\n\
\ \"acc_norm_stderr\": 0.004396806562351326\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n\
\ \"acc_stderr\": 0.037125378336148665,\n \"acc_norm\": 0.24444444444444444,\n\
\ \"acc_norm_stderr\": 0.037125378336148665\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.03782728980865469,\n\
\ \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.03782728980865469\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.02544786382510861,\n\
\ \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.02544786382510861\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.03586879280080343,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.03586879280080343\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.36,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n\
\ \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.20425531914893616,\n \"acc_stderr\": 0.026355158413349424,\n\
\ \"acc_norm\": 0.20425531914893616,\n \"acc_norm_stderr\": 0.026355158413349424\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n\
\ \"acc_stderr\": 0.03892431106518754,\n \"acc_norm\": 0.21929824561403508,\n\
\ \"acc_norm_stderr\": 0.03892431106518754\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.036951833116502325,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.036951833116502325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708617,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708617\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15873015873015872,\n\
\ \"acc_stderr\": 0.03268454013011743,\n \"acc_norm\": 0.15873015873015872,\n\
\ \"acc_norm_stderr\": 0.03268454013011743\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n\
\ \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n\
\ \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3054187192118227,\n \"acc_stderr\": 0.03240661565868408,\n\
\ \"acc_norm\": 0.3054187192118227,\n \"acc_norm_stderr\": 0.03240661565868408\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2676767676767677,\n \"acc_stderr\": 0.03154449888270285,\n \"\
acc_norm\": 0.2676767676767677,\n \"acc_norm_stderr\": 0.03154449888270285\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.32124352331606215,\n \"acc_stderr\": 0.033699508685490674,\n\
\ \"acc_norm\": 0.32124352331606215,\n \"acc_norm_stderr\": 0.033699508685490674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.36153846153846153,\n \"acc_stderr\": 0.024359581465396983,\n\
\ \"acc_norm\": 0.36153846153846153,\n \"acc_norm_stderr\": 0.024359581465396983\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073828,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073828\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.031041941304059288,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.031041941304059288\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24220183486238533,\n \"acc_stderr\": 0.01836817630659862,\n \"\
acc_norm\": 0.24220183486238533,\n \"acc_norm_stderr\": 0.01836817630659862\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2647058823529412,\n \"acc_stderr\": 0.030964517926923403,\n \"\
acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.030964517926923403\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.23628691983122363,\n \"acc_stderr\": 0.02765215314415927,\n \
\ \"acc_norm\": 0.23628691983122363,\n \"acc_norm_stderr\": 0.02765215314415927\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.20179372197309417,\n\
\ \"acc_stderr\": 0.026936111912802273,\n \"acc_norm\": 0.20179372197309417,\n\
\ \"acc_norm_stderr\": 0.026936111912802273\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2900763358778626,\n \"acc_stderr\": 0.03980066246467765,\n\
\ \"acc_norm\": 0.2900763358778626,\n \"acc_norm_stderr\": 0.03980066246467765\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.371900826446281,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16964285714285715,\n\
\ \"acc_stderr\": 0.035623678500953895,\n \"acc_norm\": 0.16964285714285715,\n\
\ \"acc_norm_stderr\": 0.035623678500953895\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26947637292464877,\n\
\ \"acc_stderr\": 0.01586624307321506,\n \"acc_norm\": 0.26947637292464877,\n\
\ \"acc_norm_stderr\": 0.01586624307321506\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.21676300578034682,\n \"acc_stderr\": 0.022183477668412856,\n\
\ \"acc_norm\": 0.21676300578034682,\n \"acc_norm_stderr\": 0.022183477668412856\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.014355911964767864,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.014355911964767864\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.23202614379084968,\n \"acc_stderr\": 0.024170840879341026,\n\
\ \"acc_norm\": 0.23202614379084968,\n \"acc_norm_stderr\": 0.024170840879341026\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2797427652733119,\n\
\ \"acc_stderr\": 0.025494259350694888,\n \"acc_norm\": 0.2797427652733119,\n\
\ \"acc_norm_stderr\": 0.025494259350694888\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25308641975308643,\n \"acc_stderr\": 0.024191808600713,\n\
\ \"acc_norm\": 0.25308641975308643,\n \"acc_norm_stderr\": 0.024191808600713\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2765957446808511,\n \"acc_stderr\": 0.026684564340460997,\n \
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.026684564340460997\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24837027379400262,\n\
\ \"acc_stderr\": 0.011035212598034498,\n \"acc_norm\": 0.24837027379400262,\n\
\ \"acc_norm_stderr\": 0.011035212598034498\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.030187532060329376,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.030187532060329376\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2173202614379085,\n \"acc_stderr\": 0.01668482092914859,\n \
\ \"acc_norm\": 0.2173202614379085,\n \"acc_norm_stderr\": 0.01668482092914859\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.23636363636363636,\n\
\ \"acc_stderr\": 0.04069306319721376,\n \"acc_norm\": 0.23636363636363636,\n\
\ \"acc_norm_stderr\": 0.04069306319721376\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.23265306122448978,\n \"acc_stderr\": 0.027049257915896175,\n\
\ \"acc_norm\": 0.23265306122448978,\n \"acc_norm_stderr\": 0.027049257915896175\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21890547263681592,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.21890547263681592,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.18674698795180722,\n\
\ \"acc_stderr\": 0.03033874914450061,\n \"acc_norm\": 0.18674698795180722,\n\
\ \"acc_norm_stderr\": 0.03033874914450061\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03126781714663178,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03126781714663178\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2594859241126071,\n\
\ \"mc1_stderr\": 0.015345409485557994,\n \"mc2\": 0.4455926367351534,\n\
\ \"mc2_stderr\": 0.015305936450342793\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.516179952644041,\n \"acc_stderr\": 0.014045126130978608\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Felladrin/Minueza-32Mx2-Chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|arc:challenge|25_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|gsm8k|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hellaswag|10_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T19-34-20.089689.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T19-34-20.089689.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- '**/details_harness|winogrande|5_2024-03-09T19-34-20.089689.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T19-34-20.089689.parquet'
- config_name: results
data_files:
- split: 2024_03_09T19_34_20.089689
path:
- results_2024-03-09T19-34-20.089689.parquet
- split: latest
path:
- results_2024-03-09T19-34-20.089689.parquet
---
# Dataset Card for Evaluation run of Felladrin/Minueza-32Mx2-Chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Felladrin/Minueza-32Mx2-Chat](https://huggingface.co/Felladrin/Minueza-32Mx2-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Felladrin__Minueza-32Mx2-Chat",
"harness_winogrande_5",
	split="latest")
```
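The per-task config names follow a regular pattern derived from the harness task ids listed above (e.g. `hendrycksTest-abstract_algebra` with 5 few-shot examples becomes `harness_hendrycksTest_abstract_algebra_5`). A small sketch of building those names programmatically (the helper name is ours, not part of any library):

```python
def harness_config_name(task: str, num_fewshot: int) -> str:
    """Build a dataset config name from a raw harness task id.

    Config names replace the "-" and ":" separators of the harness
    task id (e.g. "hendrycksTest-abstract_algebra", "truthfulqa:mc")
    with underscores and append the few-shot count.
    """
    normalized = task.replace("-", "_").replace(":", "_")
    return f"harness_{normalized}_{num_fewshot}"

print(harness_config_name("hendrycksTest-abstract_algebra", 5))
# harness_hendrycksTest_abstract_algebra_5
print(harness_config_name("truthfulqa:mc", 0))
# harness_truthfulqa_mc_0
```

The resulting string can be passed as the second argument to `load_dataset` in place of `"harness_winogrande_5"` above.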
## Latest results
These are the [latest results from run 2024-03-09T19:34:20.089689](https://huggingface.co/datasets/open-llm-leaderboard/details_Felladrin__Minueza-32Mx2-Chat/blob/main/results_2024-03-09T19-34-20.089689.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each can be found in its own timestamped split and in the "latest" split of the corresponding config):
```json
{
"all": {
"acc": 0.25904930626678757,
"acc_stderr": 0.03081108979139557,
"acc_norm": 0.25975067177423705,
"acc_norm_stderr": 0.03163420298964525,
"mc1": 0.2594859241126071,
"mc1_stderr": 0.015345409485557994,
"mc2": 0.4455926367351534,
"mc2_stderr": 0.015305936450342793
},
"harness|arc:challenge|25": {
"acc": 0.16040955631399317,
"acc_stderr": 0.010724336059110964,
"acc_norm": 0.20136518771331058,
"acc_norm_stderr": 0.01171892747744427
},
"harness|hellaswag|10": {
"acc": 0.2650866361282613,
"acc_stderr": 0.004404772735765963,
"acc_norm": 0.2635929097789285,
"acc_norm_stderr": 0.004396806562351326
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.037125378336148665,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.037125378336148665
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.03782728980865469,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.03782728980865469
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.02544786382510861,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.02544786382510861
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080343,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080343
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20425531914893616,
"acc_stderr": 0.026355158413349424,
"acc_norm": 0.20425531914893616,
"acc_norm_stderr": 0.026355158413349424
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.03892431106518754,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.03892431106518754
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.036951833116502325,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.036951833116502325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708617,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708617
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15873015873015872,
"acc_stderr": 0.03268454013011743,
"acc_norm": 0.15873015873015872,
"acc_norm_stderr": 0.03268454013011743
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3054187192118227,
"acc_stderr": 0.03240661565868408,
"acc_norm": 0.3054187192118227,
"acc_norm_stderr": 0.03240661565868408
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2676767676767677,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.2676767676767677,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.32124352331606215,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.32124352331606215,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36153846153846153,
"acc_stderr": 0.024359581465396983,
"acc_norm": 0.36153846153846153,
"acc_norm_stderr": 0.024359581465396983
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073828,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073828
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.031041941304059288,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.031041941304059288
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24220183486238533,
"acc_stderr": 0.01836817630659862,
"acc_norm": 0.24220183486238533,
"acc_norm_stderr": 0.01836817630659862
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.030964517926923403,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.030964517926923403
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.23628691983122363,
"acc_stderr": 0.02765215314415927,
"acc_norm": 0.23628691983122363,
"acc_norm_stderr": 0.02765215314415927
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.20179372197309417,
"acc_stderr": 0.026936111912802273,
"acc_norm": 0.20179372197309417,
"acc_norm_stderr": 0.026936111912802273
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2900763358778626,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.2900763358778626,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.16964285714285715,
"acc_stderr": 0.035623678500953895,
"acc_norm": 0.16964285714285715,
"acc_norm_stderr": 0.035623678500953895
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26947637292464877,
"acc_stderr": 0.01586624307321506,
"acc_norm": 0.26947637292464877,
"acc_norm_stderr": 0.01586624307321506
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.21676300578034682,
"acc_stderr": 0.022183477668412856,
"acc_norm": 0.21676300578034682,
"acc_norm_stderr": 0.022183477668412856
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767864,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767864
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23202614379084968,
"acc_stderr": 0.024170840879341026,
"acc_norm": 0.23202614379084968,
"acc_norm_stderr": 0.024170840879341026
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2797427652733119,
"acc_stderr": 0.025494259350694888,
"acc_norm": 0.2797427652733119,
"acc_norm_stderr": 0.025494259350694888
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25308641975308643,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.25308641975308643,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.026684564340460997,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.026684564340460997
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24837027379400262,
"acc_stderr": 0.011035212598034498,
"acc_norm": 0.24837027379400262,
"acc_norm_stderr": 0.011035212598034498
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.030187532060329376,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.030187532060329376
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2173202614379085,
"acc_stderr": 0.01668482092914859,
"acc_norm": 0.2173202614379085,
"acc_norm_stderr": 0.01668482092914859
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.04069306319721376,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.04069306319721376
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.23265306122448978,
"acc_stderr": 0.027049257915896175,
"acc_norm": 0.23265306122448978,
"acc_norm_stderr": 0.027049257915896175
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21890547263681592,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.21890547263681592,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-virology|5": {
"acc": 0.18674698795180722,
"acc_stderr": 0.03033874914450061,
"acc_norm": 0.18674698795180722,
"acc_norm_stderr": 0.03033874914450061
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03126781714663178,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03126781714663178
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2594859241126071,
"mc1_stderr": 0.015345409485557994,
"mc2": 0.4455926367351534,
"mc2_stderr": 0.015305936450342793
},
"harness|winogrande|5": {
"acc": 0.516179952644041,
"acc_stderr": 0.014045126130978608
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
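As a sketch of working with this layout, the per-subject MMLU accuracies can be averaged straight from the JSON above by filtering on the `harness|hendrycksTest-` prefix. The dict below is a three-subject excerpt of the full results, not the complete set:

```python
import json

# Excerpt of the results JSON above (3 of the 57 MMLU subjects).
results_json = """
{
  "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.25, "acc_norm": 0.25},
  "harness|hendrycksTest-anatomy|5": {"acc": 0.24444444444444444, "acc_norm": 0.24444444444444444},
  "harness|hendrycksTest-astronomy|5": {"acc": 0.3157894736842105, "acc_norm": 0.3157894736842105}
}
"""

results = json.loads(results_json)

# MMLU subtasks are the keys prefixed with "harness|hendrycksTest-".
mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]

mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU subjects: {len(mmlu_accs)}, macro-average acc: {mmlu_avg:.4f}")
```

Run against the full results file, the same filter covers all 57 subjects; the leaderboard's reported MMLU score is this unweighted macro-average.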
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
lucyd/deepgen | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 27437
num_examples: 267
download_size: 12561
dataset_size: 27437
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jondurbin/truthy-dpo-v0.1 | ---
license: cc-by-4.0
---
## Truthy DPO
This is a dataset designed to enhance the overall truthfulness of LLMs, without sacrificing immersion when roleplaying as a human.
For example, as a normal AI assistant, the model should not try to describe what the warmth of the sun feels like, but if the system prompt indicates it is roleplaying as a human, it should.
It mostly targets corporeal, spatial, and temporal awareness, and common misconceptions.
### Contribute
If you're interested in new functionality/datasets, take a look at [bagel repo](https://github.com/jondurbin/bagel) and [airoboros](https://github.com/jondurbin/airoboros) and either make a PR or open an issue with details.
To help me with the fine-tuning costs, dataset generation, etc., please use one of the following:
- https://bmc.link/jondurbin
- ETH 0xce914eAFC2fe52FdceE59565Dd92c06f776fcb11
- BTC bc1qdwuth4vlg8x37ggntlxu5cjfwgmdy5zaa7pswf |
stockeh/dog-pose-cv | ---
license: apache-2.0
task_categories:
- image-classification
language:
- en
size_categories:
- 10K<n<100K
pretty_name: DogPoseCV
---
# Dataset Card for DogPoseCV
This dataset contains 20,578 images of dogs in various poses, labeled as `standing`, `sitting`, `lying down`, or `undefined`. It is intended for computer vision tasks to identify a dog's behavior from images.
## Dataset Details
- **Curated by:** Jason Stock and Tom Cavey, Computer Science, Colorado State University
- **Paper:** [arxiv.org/abs/2101.02380](https://arxiv.org/abs/2101.02380) ([BibTeX](#citation))
- **Repository:** [github.com/stockeh/canine-embedded-ml](https://github.com/stockeh/canine-embedded-ml)
The dataset is intended to be used to train computer vision models to identify a dog's pose/behavior (standing, sitting, lying down) from images. This can enable applications to automatically detect and respond to a dog's actions. The variety of dog breeds enables robust generalization for real-time inference of dog actions.
### Dataset Structure
The dataset contains 20,578 RGB images of 120 dog breeds. Images are labeled as one of four classes:
- standing (4143 images)
- sitting (3038 images)
- lying down (7090 images)
- undefined (6307 images)
Images have varying resolutions, with 50% between 361x333 and 500x453 pixels.
#### Data Collection and Processing
This dataset is an adaptation of the [Stanford Dogs Dataset](http://vision.stanford.edu/aditya86/ImageNetDogs/), relabeled from dog breeds to poses. We manually labeled each image as `standing`, `sitting`, or `lying down`, or as `undefined` if the pose was indistinguishable, e.g., ambiguous between two positions.
## Bias, Risks, and Limitations
The dataset has a class imbalance, with nearly 2x as many "lying down" images compared to "sitting". Indistinguishable poses were labeled as "undefined", with most being close-up portraits. This may limit the ability to handle such images.
**Recommendations**: When using this dataset, be aware of the class imbalance and consider oversampling or augmentation techniques.
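One common mitigation for the imbalance noted above is inverse-frequency class weighting in the loss or sampler. A minimal sketch using the class counts stated in this card (the three defined poses only; the normalization scheme here is one of several reasonable choices):

```python
# Class counts for the three defined poses, taken from the card above.
counts = {"standing": 4143, "sitting": 3038, "lying down": 7090}

total = sum(counts.values())
num_classes = len(counts)

# Inverse-frequency weights: rarer classes get proportionally larger weight,
# normalized so the average weight over all samples is 1.0.
weights = {cls: total / (num_classes * n) for cls, n in counts.items()}

for cls, w in sorted(weights.items(), key=lambda kv: -kv[1]):
    print(f"{cls:>10s}: count={counts[cls]:5d}, weight={w:.3f}")
```

These weights can be passed, for example, to a weighted cross-entropy loss or a weighted random sampler so that "sitting" examples count roughly twice as much as "lying down" ones during training.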
## Citation
```
@article{stock2021s,
title={Who's a Good Boy? Reinforcing Canine Behavior in Real-Time using Machine Learning},
author={Stock, Jason and Cavey, Tom},
journal={arXiv preprint arXiv:2101.02380},
year={2021}
}
``` |
mankness/ecommerce-faq | ---
pretty_name: ecommerce-faq
--- |
jordanfan/processed_us_congress_117_bills_v3 | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: index
dtype: int64
- name: id
dtype: string
- name: policy_areas
dtype: string
- name: cur_summary
dtype: string
- name: cur_text
dtype: string
- name: title
dtype: string
- name: titles_official
dtype: string
- name: titles_short
dtype: string
- name: sponsor_name
dtype: string
- name: sponsor_party
dtype: string
- name: sponsor_state
dtype: string
- name: cleaned_summary
dtype: string
- name: extracted_text
dtype: string
- name: extracted_text_375
dtype: string
- name: extracted_text_750
dtype: string
- name: extracted_text_1000
dtype: string
- name: bertsum_extracted_250
dtype: string
- name: bertsum_extracted_375
dtype: string
- name: bertsum_extracted_375_1000
dtype: string
- name: bertsum_extracted_250_1000
dtype: string
- name: bertsum_extracted_375_750
dtype: string
- name: bertsum_extracted_250_750
dtype: string
- name: bertsum_extracted_375_500
dtype: string
- name: bertsum_extracted_250_500
dtype: string
- name: bertsum_extracted_375_375
dtype: string
- name: bertsum_extracted_250_375
dtype: string
splits:
- name: train
num_bytes: 614026113
num_examples: 11277
- name: val
num_bytes: 179492083
num_examples: 3388
- name: test
num_bytes: 28166503
num_examples: 377
download_size: 355877521
dataset_size: 821684699
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
autoevaluate/autoeval-eval-futin__feed-top_vi-b5257d-2174969941 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/feed
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-3b
metrics: []
dataset_name: futin/feed
dataset_config: top_vi
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-3b
* Dataset: futin/feed
* Config: top_vi
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
deepalpv/lisa | ---
license: osl-3.0
---
|
minoruskore/isbl | ---
dataset_info:
features:
- name: image
dtype: image
- name: tags
dtype: string
splits:
- name: train
num_bytes: 406713495.0
num_examples: 32
download_size: 406657072
dataset_size: 406713495.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
DopeorNope/new_instruct_no_ssl | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 130301895
num_examples: 121332
download_size: 78246124
dataset_size: 130301895
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/myrrh_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of myrrh/ミルラ/末药 (Arknights)
This is the dataset of myrrh/ミルラ/末药 (Arknights), containing 87 images and their tags.
The core tags of this character are `animal_ears, green_eyes, glasses, red_hair, fox_ears, short_hair, hair_ornament, ahoge, hair_over_one_eye, tail, fox_tail, fox_girl`, which are pruned in this dataset.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 87 | 128.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/myrrh_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 87 | 111.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/myrrh_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 203 | 212.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/myrrh_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/myrrh_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 33 |  |  |  |  |  | 1girl, solo, brown_gloves, shirt, holding, cape, long_sleeves, looking_at_viewer, vial, black_skirt, bandaged_leg, black_choker, id_card, simple_background, thigh_strap, bag, test_tube, white_background, full_body, bow, smile |
| 1 | 12 |  |  |  |  |  | 1girl, ears_through_headwear, solo, bare_shoulders, black_headwear, holding, short_sleeves, hat, looking_at_viewer, official_alternate_costume, yellow_shirt, backpack, black_scarf, off_shoulder, standing, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | brown_gloves | shirt | holding | cape | long_sleeves | looking_at_viewer | vial | black_skirt | bandaged_leg | black_choker | id_card | simple_background | thigh_strap | bag | test_tube | white_background | full_body | bow | smile | ears_through_headwear | bare_shoulders | black_headwear | short_sleeves | hat | official_alternate_costume | yellow_shirt | backpack | black_scarf | off_shoulder | standing | upper_body |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------------|:--------|:----------|:-------|:---------------|:--------------------|:-------|:--------------|:---------------|:---------------|:----------|:--------------------|:--------------|:------|:------------|:-------------------|:------------|:------|:--------|:------------------------|:-----------------|:-----------------|:----------------|:------|:-----------------------------|:---------------|:-----------|:--------------|:---------------|:-----------|:-------------|
| 0 | 33 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | X | | | X | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
vikp/doclaynet_processed | ---
dataset_info:
features:
- name: image
dtype: image
- name: bboxes
sequence:
sequence: float64
- name: labels
sequence: int64
- name: words
sequence: string
- name: split
dtype: string
splits:
- name: train
num_bytes: 32034973965.125
num_examples: 80863
download_size: 0
dataset_size: 32034973965.125
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "doclaynet_processed"
Clean version of [DocLayNet](https://github.com/DS4SD/DocLayNet) ready for finetuning. |
yiyic/clirmatrix | ---
dataset_info:
features:
- name: text
dtype: string
- name: rate
dtype: int64
- name: __index_level_0__
dtype: string
splits:
- name: de_en_multi8_test1
num_bytes: 1334400
num_examples: 1000
- name: de_fr_multi8_test1
num_bytes: 1336714
num_examples: 1000
- name: de_es_multi8_test1
num_bytes: 1336408
num_examples: 1000
- name: en_de_multi8_test1
num_bytes: 1146916
num_examples: 1000
- name: en_fr_multi8_test1
num_bytes: 1148710
num_examples: 1000
- name: en_es_multi8_test1
num_bytes: 1148404
num_examples: 1000
- name: es_en_multi8_test1
num_bytes: 1119660
num_examples: 1000
- name: es_fr_multi8_test1
num_bytes: 1121974
num_examples: 1000
- name: es_de_multi8_test1
num_bytes: 1120180
num_examples: 1000
- name: fr_en_multi8_test1
num_bytes: 1161002
num_examples: 1000
- name: fr_de_multi8_test1
num_bytes: 1161522
num_examples: 1000
- name: fr_es_multi8_test1
num_bytes: 1163010
num_examples: 1000
download_size: 8823803
dataset_size: 14298900
configs:
- config_name: default
data_files:
- split: de_en_multi8_test1
path: data/de_en_multi8_test1-*
- split: de_fr_multi8_test1
path: data/de_fr_multi8_test1-*
- split: de_es_multi8_test1
path: data/de_es_multi8_test1-*
- split: en_de_multi8_test1
path: data/en_de_multi8_test1-*
- split: en_fr_multi8_test1
path: data/en_fr_multi8_test1-*
- split: en_es_multi8_test1
path: data/en_es_multi8_test1-*
- split: es_en_multi8_test1
path: data/es_en_multi8_test1-*
- split: es_fr_multi8_test1
path: data/es_fr_multi8_test1-*
- split: es_de_multi8_test1
path: data/es_de_multi8_test1-*
- split: fr_en_multi8_test1
path: data/fr_en_multi8_test1-*
- split: fr_de_multi8_test1
path: data/fr_de_multi8_test1-*
- split: fr_es_multi8_test1
path: data/fr_es_multi8_test1-*
---
|
Mitsuki-Sakamoto/fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_0.1_seed_2_t_1.0_eval | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: index
dtype: int64
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
- name: gen_proxy_reward
dtype: float64
- name: gen_gold_reward
dtype: float64
splits:
- name: epoch_0
num_bytes: 44053127
num_examples: 18928
- name: epoch_1
num_bytes: 44669586
num_examples: 18928
- name: epoch_10
num_bytes: 44730949
num_examples: 18928
- name: epoch_11
num_bytes: 44730605
num_examples: 18928
- name: epoch_12
num_bytes: 44730117
num_examples: 18928
- name: epoch_13
num_bytes: 44728361
num_examples: 18928
- name: epoch_14
num_bytes: 44730267
num_examples: 18928
- name: epoch_15
num_bytes: 44728443
num_examples: 18928
- name: epoch_16
num_bytes: 44728791
num_examples: 18928
- name: epoch_17
num_bytes: 44729768
num_examples: 18928
- name: epoch_18
num_bytes: 44729337
num_examples: 18928
- name: epoch_19
num_bytes: 44729952
num_examples: 18928
- name: epoch_2
num_bytes: 44733170
num_examples: 18928
- name: epoch_20
num_bytes: 44730371
num_examples: 18928
- name: epoch_21
num_bytes: 44730305
num_examples: 18928
- name: epoch_22
num_bytes: 44729540
num_examples: 18928
- name: epoch_23
num_bytes: 44729640
num_examples: 18928
- name: epoch_24
num_bytes: 44730718
num_examples: 18928
- name: epoch_25
num_bytes: 44731263
num_examples: 18928
- name: epoch_26
num_bytes: 44729373
num_examples: 18928
- name: epoch_27
num_bytes: 44729728
num_examples: 18928
- name: epoch_28
num_bytes: 44729738
num_examples: 18928
- name: epoch_29
num_bytes: 44729945
num_examples: 18928
- name: epoch_3
num_bytes: 44770625
num_examples: 18928
- name: epoch_4
num_bytes: 44776461
num_examples: 18928
- name: epoch_5
num_bytes: 44762728
num_examples: 18928
- name: epoch_6
num_bytes: 44749183
num_examples: 18928
- name: epoch_7
num_bytes: 44739230
num_examples: 18928
- name: epoch_8
num_bytes: 44733018
num_examples: 18928
- name: epoch_9
num_bytes: 44733757
num_examples: 18928
download_size: 710026115
dataset_size: 1341318096
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
---
# Dataset Card for "fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_0.1_seed_2_t_1.0_eval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
senhorsapo/nicole | ---
license: openrail
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/fe795838 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 180
num_examples: 10
download_size: 1331
dataset_size: 180
---
# Dataset Card for "fe795838"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jiangyige/PSP5 | ---
license: unknown
---
---
Description:
The Paraphrased Sentence Pairs - 5 types (PSP-5) dataset comprises five fundamental categories of paraphrased English sentence pairs:
1. Declarative sentences (statements)
2. Interrogative sentences (questions)
3. Imperative sentences (commands)
4. Exclamatory sentences (exclamations)
5. Sentence fragments (spoken English)
The table has three columns:
1. sentence
2. chatGPT_paraphrased
3. type
There are 10,000 sentence pairs in total.
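As a hypothetical sketch of the schema above (the column names come from the list; the sentence text and the category label strings are invented, since the card does not specify the label format):

```python
# Hypothetical PSP-5 record illustrating the three-column schema.
# The text values and category labels below are invented for illustration.
record = {
    "sentence": "Close the window before you leave.",
    "chatGPT_paraphrased": "Please shut the window prior to leaving.",
    "type": "imperative",  # one of the five sentence categories
}

# Assumed label names for the five categories listed in the card
# (the actual dataset may encode them differently).
PSP5_TYPES = {"declarative", "interrogative", "imperative", "exclamatory", "fragment"}
assert record["type"] in PSP5_TYPES
```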
--- |
feedback-to-code/Server_Text_Dataset_1 | ---
license: apache-2.0
---
|
yangwang825/sst2-pwws-2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: augment
dtype: string
splits:
- name: train
num_bytes: 2593835
num_examples: 20728
- name: validation
num_bytes: 110096
num_examples: 872
- name: test
num_bytes: 226340
num_examples: 1821
download_size: 1120309
dataset_size: 2930271
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Anusha64/train-dataset-aeon | ---
license: mit
dataset_info:
features:
- name: instruction
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 70601
num_examples: 31
- name: validation
num_bytes: 9705
num_examples: 5
- name: test
num_bytes: 16493
num_examples: 7
download_size: 75537
dataset_size: 96799
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
JorangHorse/Third | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 1213654.0
num_examples: 2
download_size: 623252
dataset_size: 1213654.0
---
# Dataset Card for "Third"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ramunenavath/mydataset | ---
license: openrail
---
|
joey234/mmlu-college_chemistry-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
- name: neg_prompt
dtype: string
- name: fewshot_context_neg
dtype: string
- name: fewshot_context_ori
dtype: string
splits:
- name: dev
num_bytes: 7604
num_examples: 5
- name: test
num_bytes: 807404
num_examples: 100
download_size: 137885
dataset_size: 815008
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-college_chemistry-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
darssanle/Primate_Dataset_With_Specifics | ---
license: mit
task_categories:
- text-classification
language:
- en
tags:
- medical
pretty_name: primate_dataset
size_categories:
- 10K<n<100K
--- |
helenqu/astro-classification-redshifts | ---
license: mit
tags:
- time series
- astrophysics
- pretraining
- connect-later
size_categories:
- 100K<n<1M
---
# AstroClassification and Redshifts Datasets
<!-- Provide a quick summary of the dataset. -->
This dataset was used for the AstroClassification and Redshifts tasks introduced in [Connect Later: Improving Fine-tuning for Robustness with Targeted Augmentations](). This is a dataset of simulated astronomical time series (e.g., supernovae, active galactic nuclei); the task is to classify the object type (AstroClassification) or predict the object's redshift (Redshifts).
- **Repository:** https://github.com/helenqu/connect-later
- **Paper:** will be updated
- **Point of Contact:** Helen Qu (<helenqu@sas.upenn.edu>)
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
- **object_id**: unique object identifier
- **times_wv**: 2D array of shape (N, 2) containing the observation times (modified Julian days, MJD) and filter (wavelength in nm) for each observation, N=number of observations
- **lightcurve**: 2D array of shape (N, 2) containing the flux (arbitrary units) and flux error for each observation
- **label**: integer representing the class of the object (see below for details)
- **redshift**: redshift of the object
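A minimal sketch of one record, using plain Python lists in place of the actual arrays (field names and shapes from the list above; the numeric values are invented):

```python
# Illustrative record matching the field list above; the values are invented.
# times_wv and lightcurve are (N, 2) arrays; here N = 3 observations.
example = {
    "object_id": 615,
    "times_wv": [          # [observation time (MJD), filter wavelength (nm)]
        [59580.03, 476.7],
        [59580.19, 621.5],
        [59584.11, 754.5],
    ],
    "lightcurve": [        # [flux (arbitrary units), flux error]
        [112.4, 3.1],
        [98.7, 2.9],
        [130.2, 3.4],
    ],
    "label": 11,           # e.g. 11 = type Ia supernova (SNIa) in the class list
    "redshift": 0.23,
}

# Both 2D fields share the same number of rows N, with two columns each.
assert len(example["times_wv"]) == len(example["lightcurve"])
assert all(len(row) == 2 for row in example["times_wv"] + example["lightcurve"])
```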
## Dataset Creation
### Source Data
This is a modified version of the dataset from the 2018 Photometric LSST Astronomical Time-Series Classification Challenge (PLAsTiCC) Kaggle competition.
The original Kaggle competition can be found [here](https://www.kaggle.com/c/PLAsTiCC-2018). [This note](https://arxiv.org/abs/1810.00001) from the competition describes the dataset in detail. Astronomers may be interested in [this paper](https://arxiv.org/abs/1903.11756) describing the simulations used to generate the data.
- **Train**: 80% of the original PLAsTiCC training set augmented using the redshifting targeted augmentation described in the Connect Later paper
- **Validation**: Remaining 20% of the original PLAsTiCC training set, *not* augmented or modified
- **Test**: Subset of 10,000 objects randomly selected from the PLAsTiCC test set
### Object Types
```
0: microlens-single
1: tidal disruption event (TDE)
2: eclipsing binary (EB)
3: type II supernova (SNII)
4: peculiar type Ia supernova (SNIax)
5: Mira variable
6: type Ibc supernova (SNIbc)
7: kilonova (KN)
8: M-dwarf
9: peculiar type Ia supernova (SNIa-91bg)
10: active galactic nuclei (AGN)
11: type Ia supernova (SNIa)
12: RR-Lyrae (RRL)
13: superluminous supernova (SLSN-I)
14: 5 "anomalous" types that are not present in training set: microlens-binary, intermediate luminosity optical transient (ILOT), calcium-rich transient (CaRT), pair instability supernova (PISN), microlens-string
```
## Citation
will be updated |
ekolasky/DWIEForCustomLEDConsol | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: result_labels
sequence: int64
- name: grouping_vector
sequence:
sequence: int64
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 4305967
num_examples: 500
- name: validation
num_bytes: 876462
num_examples: 96
download_size: 925707
dataset_size: 5182429
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
passionMan/mimic_tokenized_dataset_balanced_frac_0.1 | ---
dataset_info:
features:
- name: context
dtype: string
- name: label
dtype: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 195080305
num_examples: 64761
- name: test
num_bytes: 65034069
num_examples: 21588
download_size: 36831082
dataset_size: 260114374
---
# Dataset Card for "mimic_tokenized_dataset_balanced_frac_0.1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cmu-mlsp/librispeech960-encodec1024_asr | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
- split: validation_other
path: data/validation_other-*
- split: test_other
path: data/test_other-*
dataset_info:
features:
- name: text
dtype: string
- name: audio_codes
sequence: string
- name: id
dtype: string
- name: speaker_id
dtype: int64
- name: chapter_id
dtype: int64
splits:
- name: train
num_bytes: 1859401929
num_examples: 281241
- name: validation
num_bytes: 10515210
num_examples: 2703
- name: test
num_bytes: 10516648
num_examples: 2620
- name: validation_other
num_bytes: 9974741
num_examples: 2864
- name: test_other
num_bytes: 10389123
num_examples: 2939
download_size: 0
dataset_size: 1900797651
---
# Dataset Card for "librispeech960-encodec1024_asr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bio-datasets/dft23-full | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: answer_a
dtype: string
- name: answer_b
dtype: string
- name: answer_c
dtype: string
- name: answer_d
dtype: string
- name: answer_e
dtype: string
- name: correct_answers
sequence:
class_label:
names:
'0': a
'1': b
'2': c
'3': d
'4': e
- name: subject_name
dtype: string
- name: number_correct_answers
dtype:
class_label:
names:
'0': '1'
'1': '2'
'2': '3'
'3': '4'
'4': '5'
splits:
- name: train
num_bytes: 1004721
num_examples: 2171
- name: validation
num_bytes: 136786
num_examples: 312
- name: test
num_bytes: 284765
num_examples: 622
download_size: 894075
dataset_size: 1426272
---
# Dataset Card for "dft23-full"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_mrpc_give_passive | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 14459
num_examples: 56
- name: train
num_bytes: 27981
num_examples: 109
- name: validation
num_bytes: 2653
num_examples: 10
download_size: 39594
dataset_size: 45093
---
# Dataset Card for "MULTI_VALUE_mrpc_give_passive"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chansung/llama2-stories | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: title
dtype: string
- name: image
dtype: string
- name: story
dtype: string
splits:
- name: train
num_bytes: 4356500
num_examples: 73
download_size: 3539195
dataset_size: 4356500
---
|
carnival13/rbrt_uda_lrg_ep5_2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: domain_label
dtype: int64
- name: pass_label
dtype: int64
- name: input
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 1115662838
num_examples: 755110
download_size: 352431197
dataset_size: 1115662838
---
# Dataset Card for "rbrt_uda_lrg_ep5_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
imanmalhi/canada_realestate_listings | ---
license: mit
---
|
open-llm-leaderboard/details_Sao10K__NyakuraV2.1-m7 | ---
pretty_name: Evaluation run of Sao10K/NyakuraV2.1-m7
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/NyakuraV2.1-m7](https://huggingface.co/Sao10K/NyakuraV2.1-m7) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__NyakuraV2.1-m7\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-12T04:30:54.576577](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__NyakuraV2.1-m7/blob/main/results_2023-12-12T04-30-54.576577.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5812856791159661,\n\
\ \"acc_stderr\": 0.03351473539841468,\n \"acc_norm\": 0.5885734680789351,\n\
\ \"acc_norm_stderr\": 0.03422448074980651,\n \"mc1\": 0.29498164014687883,\n\
\ \"mc1_stderr\": 0.015964400965589664,\n \"mc2\": 0.45008851442315223,\n\
\ \"mc2_stderr\": 0.015144388624059283\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5511945392491467,\n \"acc_stderr\": 0.014534599585097662,\n\
\ \"acc_norm\": 0.5861774744027304,\n \"acc_norm_stderr\": 0.014392730009221007\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6320454092810197,\n\
\ \"acc_stderr\": 0.004812633280078261,\n \"acc_norm\": 0.8188607847042422,\n\
\ \"acc_norm_stderr\": 0.003843463792037909\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.630188679245283,\n \"acc_stderr\": 0.02971142188010793,\n\
\ \"acc_norm\": 0.630188679245283,\n \"acc_norm_stderr\": 0.02971142188010793\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n\
\ \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n\
\ \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.03257901482099834,\n\
\ \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.03257901482099834\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.373015873015873,\n \"acc_stderr\": 0.02490699045899257,\n \"acc_norm\"\
: 0.373015873015873,\n \"acc_norm_stderr\": 0.02490699045899257\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6838709677419355,\n \"acc_stderr\": 0.026450874489042774,\n \"\
acc_norm\": 0.6838709677419355,\n \"acc_norm_stderr\": 0.026450874489042774\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.027171213683164542,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.027171213683164542\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5461538461538461,\n \"acc_stderr\": 0.025242770987126184,\n\
\ \"acc_norm\": 0.5461538461538461,\n \"acc_norm_stderr\": 0.025242770987126184\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652459,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652459\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.031968769891957786,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.031968769891957786\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659807,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659807\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.726605504587156,\n \"acc_stderr\": 0.019109299846098292,\n \"\
acc_norm\": 0.726605504587156,\n \"acc_norm_stderr\": 0.019109299846098292\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.03395322726375797,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.03395322726375797\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7205882352941176,\n \"acc_stderr\": 0.03149328104507957,\n \"\
acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.03149328104507957\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7172995780590717,\n \"acc_stderr\": 0.029312814153955924,\n \
\ \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.029312814153955924\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.04039314978724561,\n\
\ \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.04039314978724561\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.04742762361243011,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.04742762361243011\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n\
\ \"acc_stderr\": 0.026655699653922737,\n \"acc_norm\": 0.7905982905982906,\n\
\ \"acc_norm_stderr\": 0.026655699653922737\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7624521072796935,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.7624521072796935,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.02572280220089581,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.02572280220089581\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25139664804469275,\n\
\ \"acc_stderr\": 0.01450897945355397,\n \"acc_norm\": 0.25139664804469275,\n\
\ \"acc_norm_stderr\": 0.01450897945355397\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.027363593284684965,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.027363593284684965\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6591639871382636,\n\
\ \"acc_stderr\": 0.026920841260776165,\n \"acc_norm\": 0.6591639871382636,\n\
\ \"acc_norm_stderr\": 0.026920841260776165\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6512345679012346,\n \"acc_stderr\": 0.02651759772446501,\n\
\ \"acc_norm\": 0.6512345679012346,\n \"acc_norm_stderr\": 0.02651759772446501\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255856,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255856\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40547588005215124,\n\
\ \"acc_stderr\": 0.012539960672377202,\n \"acc_norm\": 0.40547588005215124,\n\
\ \"acc_norm_stderr\": 0.012539960672377202\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5661764705882353,\n \"acc_stderr\": 0.030105636570016633,\n\
\ \"acc_norm\": 0.5661764705882353,\n \"acc_norm_stderr\": 0.030105636570016633\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5931372549019608,\n \"acc_stderr\": 0.019873802005061177,\n \
\ \"acc_norm\": 0.5931372549019608,\n \"acc_norm_stderr\": 0.019873802005061177\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.673469387755102,\n \"acc_stderr\": 0.030021056238440307,\n\
\ \"acc_norm\": 0.673469387755102,\n \"acc_norm_stderr\": 0.030021056238440307\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.02619392354445414,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.02619392354445414\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036624\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.0330140594698725,\n\
\ \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.0330140594698725\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29498164014687883,\n\
\ \"mc1_stderr\": 0.015964400965589664,\n \"mc2\": 0.45008851442315223,\n\
\ \"mc2_stderr\": 0.015144388624059283\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7277032359905288,\n \"acc_stderr\": 0.012510697991453934\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2266868840030326,\n \
\ \"acc_stderr\": 0.011532758009339995\n }\n}\n```"
repo_url: https://huggingface.co/Sao10K/NyakuraV2.1-m7
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|arc:challenge|25_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|gsm8k|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hellaswag|10_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-12T04-30-54.576577.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-12T04-30-54.576577.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- '**/details_harness|winogrande|5_2023-12-12T04-30-54.576577.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-12T04-30-54.576577.parquet'
- config_name: results
data_files:
- split: 2023_12_12T04_30_54.576577
path:
- results_2023-12-12T04-30-54.576577.parquet
- split: latest
path:
- results_2023-12-12T04-30-54.576577.parquet
---
# Dataset Card for Evaluation run of Sao10K/NyakuraV2.1-m7
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Sao10K/NyakuraV2.1-m7](https://huggingface.co/Sao10K/NyakuraV2.1-m7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__NyakuraV2.1-m7",
"harness_winogrande_5",
	split="latest")
```
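Once loaded, each task's metrics can be read directly from the results payload. A minimal sketch, using a hand-copied excerpt of the "Latest results" JSON shown below rather than a live download, to illustrate pulling per-task accuracies out of the nested structure:

```python
# Results payload shaped like the "Latest results" JSON in this card.
# (Values copied from the excerpt below; not fetched from the Hub.)
results = {
    "all": {"acc": 0.5812856791159661, "acc_norm": 0.5885734680789351},
    "harness|arc:challenge|25": {"acc": 0.5511945392491467, "acc_norm": 0.5861774744027304},
    "harness|hellaswag|10": {"acc": 0.6320454092810197, "acc_norm": 0.8188607847042422},
}

# Per-task accuracy, skipping the aggregate "all" entry.
task_acc = {task: m["acc"] for task, m in results.items() if task != "all"}
print(task_acc)
```

The same pattern applies to `acc_norm`, `mc1`, or `mc2` keys where a task reports them.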
## Latest results
These are the [latest results from run 2023-12-12T04:30:54.576577](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__NyakuraV2.1-m7/blob/main/results_2023-12-12T04-30-54.576577.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5812856791159661,
"acc_stderr": 0.03351473539841468,
"acc_norm": 0.5885734680789351,
"acc_norm_stderr": 0.03422448074980651,
"mc1": 0.29498164014687883,
"mc1_stderr": 0.015964400965589664,
"mc2": 0.45008851442315223,
"mc2_stderr": 0.015144388624059283
},
"harness|arc:challenge|25": {
"acc": 0.5511945392491467,
"acc_stderr": 0.014534599585097662,
"acc_norm": 0.5861774744027304,
"acc_norm_stderr": 0.014392730009221007
},
"harness|hellaswag|10": {
"acc": 0.6320454092810197,
"acc_stderr": 0.004812633280078261,
"acc_norm": 0.8188607847042422,
"acc_norm_stderr": 0.003843463792037909
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.630188679245283,
"acc_stderr": 0.02971142188010793,
"acc_norm": 0.630188679245283,
"acc_norm_stderr": 0.02971142188010793
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4595744680851064,
"acc_stderr": 0.03257901482099834,
"acc_norm": 0.4595744680851064,
"acc_norm_stderr": 0.03257901482099834
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.02490699045899257,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.02490699045899257
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6838709677419355,
"acc_stderr": 0.026450874489042774,
"acc_norm": 0.6838709677419355,
"acc_norm_stderr": 0.026450874489042774
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.027171213683164542,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.027171213683164542
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5461538461538461,
"acc_stderr": 0.025242770987126184,
"acc_norm": 0.5461538461538461,
"acc_norm_stderr": 0.025242770987126184
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652459,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652459
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.031968769891957786,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.031968769891957786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659807,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659807
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.726605504587156,
"acc_stderr": 0.019109299846098292,
"acc_norm": 0.726605504587156,
"acc_norm_stderr": 0.019109299846098292
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.03395322726375797,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.03395322726375797
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7205882352941176,
"acc_stderr": 0.03149328104507957,
"acc_norm": 0.7205882352941176,
"acc_norm_stderr": 0.03149328104507957
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.029312814153955924,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.029312814153955924
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.04039314978724561,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.04039314978724561
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497751,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497751
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.04742762361243011,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.04742762361243011
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7905982905982906,
"acc_stderr": 0.026655699653922737,
"acc_norm": 0.7905982905982906,
"acc_norm_stderr": 0.026655699653922737
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7624521072796935,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.7624521072796935,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.02572280220089581,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.02572280220089581
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25139664804469275,
"acc_stderr": 0.01450897945355397,
"acc_norm": 0.25139664804469275,
"acc_norm_stderr": 0.01450897945355397
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.027363593284684965,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.027363593284684965
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6591639871382636,
"acc_stderr": 0.026920841260776165,
"acc_norm": 0.6591639871382636,
"acc_norm_stderr": 0.026920841260776165
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6512345679012346,
"acc_stderr": 0.02651759772446501,
"acc_norm": 0.6512345679012346,
"acc_norm_stderr": 0.02651759772446501
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.02952591430255856,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.02952591430255856
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.40547588005215124,
"acc_stderr": 0.012539960672377202,
"acc_norm": 0.40547588005215124,
"acc_norm_stderr": 0.012539960672377202
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5661764705882353,
"acc_stderr": 0.030105636570016633,
"acc_norm": 0.5661764705882353,
"acc_norm_stderr": 0.030105636570016633
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5931372549019608,
"acc_stderr": 0.019873802005061177,
"acc_norm": 0.5931372549019608,
"acc_norm_stderr": 0.019873802005061177
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.673469387755102,
"acc_stderr": 0.030021056238440307,
"acc_norm": 0.673469387755102,
"acc_norm_stderr": 0.030021056238440307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.02619392354445414,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.02619392354445414
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.0330140594698725,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.0330140594698725
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29498164014687883,
"mc1_stderr": 0.015964400965589664,
"mc2": 0.45008851442315223,
"mc2_stderr": 0.015144388624059283
},
"harness|winogrande|5": {
"acc": 0.7277032359905288,
"acc_stderr": 0.012510697991453934
},
"harness|gsm8k|5": {
"acc": 0.2266868840030326,
"acc_stderr": 0.011532758009339995
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
nanina1/spider | ---
license: mit
---
|
Englios/Wikipedia-Malaysian-Politicians | ---
language:
- en
---
# Summary
- Wikipedia page : https://en.wikipedia.org/wiki/Category:Malaysian_politicians
- Number of politicians : 110
- Politicians with missing (null) images : 16
- Link to dataset : https://huggingface.co/datasets/Englios/Wikipedia-Malaysian-Politicians
- Date of creation : 2024-01-20 |
shreyas1104/medical-intent-audio-dataset | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: label
dtype: int64
splits:
- name: train
num_bytes: 5549933063.18
num_examples: 5895
- name: validation
num_bytes: 295375977.0
num_examples: 385
- name: test
num_bytes: 332239652.0
num_examples: 380
download_size: 4571560620
dataset_size: 6177548692.18
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
tmnam20/Vietnamese-News-raw | ---
license: cc-by-4.0
task_categories:
- text-generation
language:
- vi
--- |
CyberHarem/blue_pokemon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of blue (Pokémon)
This is the dataset of blue (Pokémon), containing 97 images and their tags.
The core tags of this character are `brown_hair, green_eyes, spiked_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 97 | 56.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blue_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 97 | 49.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blue_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 153 | 78.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blue_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 97 | 55.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blue_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 153 | 87.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blue_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/blue_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
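Once loaded, the per-image tag metadata can be aggregated, e.g. to see which tags dominate the dataset. A minimal sketch (assuming `item.meta['tags']` is either a tag-to-score mapping or a plain list of tag strings; the exact format may vary by pipeline version):

```python
from collections import Counter

def count_tags(tag_entries):
    """Aggregate tag frequencies across an iterable of per-image tag entries."""
    counter = Counter()
    for tags in tag_entries:
        # Each entry may be a mapping (tag -> score) or a plain list of tags;
        # in both cases, count each tag once per image.
        counter.update(tags.keys() if hasattr(tags, 'keys') else tags)
    return counter

# e.g. with the waifuc source from the snippet above:
# tag_counts = count_tags(item.meta['tags'] for item in source)
# print(tag_counts.most_common(10))
```

This is how the per-cluster tag summaries below could be reproduced from the raw archive.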
## List of Clusters
List of tag clustering results; some of the character's outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1boy, holding_poke_ball, male_focus, solo, necklace, poke_ball_(basic), jacket, smile |
| 1 | 5 |  |  |  |  |  | 1boy, bangs, grin, male_focus, necklace, short_hair, brown_eyes, long_sleeves, pokemon_(creature), purple_shirt, teeth, jacket, pants, poke_ball, boots, brown_footwear, holding, one_eye_closed, solo |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1boy | holding_poke_ball | male_focus | solo | necklace | poke_ball_(basic) | jacket | smile | bangs | grin | short_hair | brown_eyes | long_sleeves | pokemon_(creature) | purple_shirt | teeth | pants | poke_ball | boots | brown_footwear | holding | one_eye_closed |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------|:--------------------|:-------------|:-------|:-----------|:--------------------|:---------|:--------|:--------|:-------|:-------------|:-------------|:---------------|:---------------------|:---------------|:--------|:--------|:------------|:--------|:-----------------|:----------|:-----------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | X | X | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_222gate__TinyMistral-248Mx4-MOE | ---
pretty_name: Evaluation run of 222gate/TinyMistral-248Mx4-MOE
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [222gate/TinyMistral-248Mx4-MOE](https://huggingface.co/222gate/TinyMistral-248Mx4-MOE)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_222gate__TinyMistral-248Mx4-MOE\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-21T07:05:45.702729](https://huggingface.co/datasets/open-llm-leaderboard/details_222gate__TinyMistral-248Mx4-MOE/blob/main/results_2024-01-21T07-05-45.702729.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24830542907208702,\n\
\ \"acc_stderr\": 0.030471240073543585,\n \"acc_norm\": 0.24917865866615294,\n\
\ \"acc_norm_stderr\": 0.03128580366341738,\n \"mc1\": 0.24357405140758873,\n\
\ \"mc1_stderr\": 0.01502635482491078,\n \"mc2\": 0.4865533579688347,\n\
\ \"mc2_stderr\": 0.01667138127210037\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2235494880546075,\n \"acc_stderr\": 0.012174896631202607,\n\
\ \"acc_norm\": 0.295221843003413,\n \"acc_norm_stderr\": 0.013329750293382316\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2561242780322645,\n\
\ \"acc_stderr\": 0.00435599209003099,\n \"acc_norm\": 0.25712009559848636,\n\
\ \"acc_norm_stderr\": 0.004361529679492746\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n\
\ \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n\
\ \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.24342105263157895,\n \"acc_stderr\": 0.034923496688842384,\n\
\ \"acc_norm\": 0.24342105263157895,\n \"acc_norm_stderr\": 0.034923496688842384\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2830188679245283,\n \"acc_stderr\": 0.027724236492700904,\n\
\ \"acc_norm\": 0.2830188679245283,\n \"acc_norm_stderr\": 0.027724236492700904\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2774566473988439,\n\
\ \"acc_stderr\": 0.03414014007044036,\n \"acc_norm\": 0.2774566473988439,\n\
\ \"acc_norm_stderr\": 0.03414014007044036\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793275,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793275\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.19,\n\
\ \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2127659574468085,\n \"acc_stderr\": 0.02675439134803976,\n\
\ \"acc_norm\": 0.2127659574468085,\n \"acc_norm_stderr\": 0.02675439134803976\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.0433913832257986,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.0433913832257986\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.03695183311650232,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.03695183311650232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.022789673145776575,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.022789673145776575\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.03809523809523812,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.03809523809523812\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.267741935483871,\n\
\ \"acc_stderr\": 0.025189006660212378,\n \"acc_norm\": 0.267741935483871,\n\
\ \"acc_norm_stderr\": 0.025189006660212378\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.22660098522167488,\n \"acc_stderr\": 0.02945486383529297,\n\
\ \"acc_norm\": 0.22660098522167488,\n \"acc_norm_stderr\": 0.02945486383529297\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.13,\n \"acc_stderr\": 0.0337997668989631,\n \"acc_norm\"\
: 0.13,\n \"acc_norm_stderr\": 0.0337997668989631\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.03317505930009179,\n\
\ \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.03317505930009179\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.20707070707070707,\n \"acc_stderr\": 0.028869778460267045,\n \"\
acc_norm\": 0.20707070707070707,\n \"acc_norm_stderr\": 0.028869778460267045\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3471502590673575,\n \"acc_stderr\": 0.034356961683613546,\n\
\ \"acc_norm\": 0.3471502590673575,\n \"acc_norm_stderr\": 0.034356961683613546\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2743589743589744,\n \"acc_stderr\": 0.022622765767493214,\n\
\ \"acc_norm\": 0.2743589743589744,\n \"acc_norm_stderr\": 0.022622765767493214\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712173,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712173\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2815126050420168,\n \"acc_stderr\": 0.029213549414372146,\n\
\ \"acc_norm\": 0.2815126050420168,\n \"acc_norm_stderr\": 0.029213549414372146\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.036313298039696545,\n \"\
acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.036313298039696545\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3100917431192661,\n \"acc_stderr\": 0.01983084968443975,\n \"\
acc_norm\": 0.3100917431192661,\n \"acc_norm_stderr\": 0.01983084968443975\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.20833333333333334,\n \"acc_stderr\": 0.027696910713093936,\n \"\
acc_norm\": 0.20833333333333334,\n \"acc_norm_stderr\": 0.027696910713093936\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.23039215686274508,\n \"acc_stderr\": 0.029554292605695046,\n \"\
acc_norm\": 0.23039215686274508,\n \"acc_norm_stderr\": 0.029554292605695046\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.25738396624472576,\n \"acc_stderr\": 0.0284588209914603,\n \
\ \"acc_norm\": 0.25738396624472576,\n \"acc_norm_stderr\": 0.0284588209914603\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.19282511210762332,\n\
\ \"acc_stderr\": 0.02647824096048936,\n \"acc_norm\": 0.19282511210762332,\n\
\ \"acc_norm_stderr\": 0.02647824096048936\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2066115702479339,\n \"acc_stderr\": 0.03695980128098825,\n \"\
acc_norm\": 0.2066115702479339,\n \"acc_norm_stderr\": 0.03695980128098825\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2037037037037037,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.2037037037037037,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.040598672469526864,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.040598672469526864\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.32038834951456313,\n \"acc_stderr\": 0.0462028408228004,\n\
\ \"acc_norm\": 0.32038834951456313,\n \"acc_norm_stderr\": 0.0462028408228004\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.20512820512820512,\n\
\ \"acc_stderr\": 0.026453508054040356,\n \"acc_norm\": 0.20512820512820512,\n\
\ \"acc_norm_stderr\": 0.026453508054040356\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.14,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.14,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.02353292543104429,\n\
\ \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.02353292543104429\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.264804469273743,\n\
\ \"acc_stderr\": 0.01475690648326066,\n \"acc_norm\": 0.264804469273743,\n\
\ \"acc_norm_stderr\": 0.01475690648326066\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.02555316999182653,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.02555316999182653\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.19935691318327975,\n\
\ \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.19935691318327975,\n\
\ \"acc_norm_stderr\": 0.022691033780549656\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2345679012345679,\n \"acc_stderr\": 0.02357688174400572,\n\
\ \"acc_norm\": 0.2345679012345679,\n \"acc_norm_stderr\": 0.02357688174400572\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24113475177304963,\n \"acc_stderr\": 0.025518731049537762,\n \
\ \"acc_norm\": 0.24113475177304963,\n \"acc_norm_stderr\": 0.025518731049537762\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2588005215123859,\n\
\ \"acc_stderr\": 0.01118610904656461,\n \"acc_norm\": 0.2588005215123859,\n\
\ \"acc_norm_stderr\": 0.01118610904656461\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3272058823529412,\n \"acc_stderr\": 0.028501452860396563,\n\
\ \"acc_norm\": 0.3272058823529412,\n \"acc_norm_stderr\": 0.028501452860396563\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24673202614379086,\n \"acc_stderr\": 0.0174408203674025,\n \
\ \"acc_norm\": 0.24673202614379086,\n \"acc_norm_stderr\": 0.0174408203674025\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.27755102040816326,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.27755102040816326,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n\
\ \"acc_stderr\": 0.03076944496729602,\n \"acc_norm\": 0.2537313432835821,\n\
\ \"acc_norm_stderr\": 0.03076944496729602\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21084337349397592,\n\
\ \"acc_stderr\": 0.031755547866299194,\n \"acc_norm\": 0.21084337349397592,\n\
\ \"acc_norm_stderr\": 0.031755547866299194\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.27485380116959063,\n \"acc_stderr\": 0.03424042924691582,\n\
\ \"acc_norm\": 0.27485380116959063,\n \"acc_norm_stderr\": 0.03424042924691582\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24357405140758873,\n\
\ \"mc1_stderr\": 0.01502635482491078,\n \"mc2\": 0.4865533579688347,\n\
\ \"mc2_stderr\": 0.01667138127210037\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5177584846093133,\n \"acc_stderr\": 0.014043619596174962\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/222gate/TinyMistral-248Mx4-MOE
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|arc:challenge|25_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|gsm8k|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hellaswag|10_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T07-05-45.702729.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T07-05-45.702729.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- '**/details_harness|winogrande|5_2024-01-21T07-05-45.702729.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-21T07-05-45.702729.parquet'
- config_name: results
data_files:
- split: 2024_01_21T07_05_45.702729
path:
- results_2024-01-21T07-05-45.702729.parquet
- split: latest
path:
- results_2024-01-21T07-05-45.702729.parquet
---
# Dataset Card for Evaluation run of 222gate/TinyMistral-248Mx4-MOE
Dataset automatically created during the evaluation run of model [222gate/TinyMistral-248Mx4-MOE](https://huggingface.co/222gate/TinyMistral-248Mx4-MOE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_222gate__TinyMistral-248Mx4-MOE",
"harness_winogrande_5",
	split="latest")
```
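Each config name above follows the pattern `harness_<task>_<num_fewshot>`, with the `-` and `:` characters of the harness task name replaced by underscores. A minimal sketch of that mapping (assuming the naming convention visible in the config list holds; the helper name is illustrative, not part of any library):

```python
def task_to_config_name(task: str, num_fewshot: int) -> str:
    """Map an lm-evaluation-harness task name to this dataset's config name.

    e.g. "hendrycksTest-world_religions" with 5 few-shot examples
    maps to "harness_hendrycksTest_world_religions_5".
    """
    # Config names use "_" where the harness task name uses "-" or ":".
    normalized = task.replace("-", "_").replace(":", "_")
    return f"harness_{normalized}_{num_fewshot}"

print(task_to_config_name("truthfulqa:mc", 0))   # harness_truthfulqa_mc_0
print(task_to_config_name("arc:challenge", 25))  # harness_arc_challenge_25
```

This makes it easy to iterate over all MMLU subtasks, for instance, by generating the config name for each `hendrycksTest-*` task and passing it to `load_dataset` as above.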
## Latest results
These are the [latest results from run 2024-01-21T07:05:45.702729](https://huggingface.co/datasets/open-llm-leaderboard/details_222gate__TinyMistral-248Mx4-MOE/blob/main/results_2024-01-21T07-05-45.702729.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24830542907208702,
"acc_stderr": 0.030471240073543585,
"acc_norm": 0.24917865866615294,
"acc_norm_stderr": 0.03128580366341738,
"mc1": 0.24357405140758873,
"mc1_stderr": 0.01502635482491078,
"mc2": 0.4865533579688347,
"mc2_stderr": 0.01667138127210037
},
"harness|arc:challenge|25": {
"acc": 0.2235494880546075,
"acc_stderr": 0.012174896631202607,
"acc_norm": 0.295221843003413,
"acc_norm_stderr": 0.013329750293382316
},
"harness|hellaswag|10": {
"acc": 0.2561242780322645,
"acc_stderr": 0.00435599209003099,
"acc_norm": 0.25712009559848636,
"acc_norm_stderr": 0.004361529679492746
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.24342105263157895,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.24342105263157895,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2830188679245283,
"acc_stderr": 0.027724236492700904,
"acc_norm": 0.2830188679245283,
"acc_norm_stderr": 0.027724236492700904
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2774566473988439,
"acc_stderr": 0.03414014007044036,
"acc_norm": 0.2774566473988439,
"acc_norm_stderr": 0.03414014007044036
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.044405219061793275,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.044405219061793275
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2127659574468085,
"acc_stderr": 0.02675439134803976,
"acc_norm": 0.2127659574468085,
"acc_norm_stderr": 0.02675439134803976
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.0433913832257986,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.0433913832257986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.03695183311650232,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.03695183311650232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.022789673145776575,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.022789673145776575
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523812,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523812
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.267741935483871,
"acc_stderr": 0.025189006660212378,
"acc_norm": 0.267741935483871,
"acc_norm_stderr": 0.025189006660212378
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.22660098522167488,
"acc_stderr": 0.02945486383529297,
"acc_norm": 0.22660098522167488,
"acc_norm_stderr": 0.02945486383529297
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.13,
"acc_stderr": 0.0337997668989631,
"acc_norm": 0.13,
"acc_norm_stderr": 0.0337997668989631
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.03317505930009179,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.03317505930009179
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20707070707070707,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.20707070707070707,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3471502590673575,
"acc_stderr": 0.034356961683613546,
"acc_norm": 0.3471502590673575,
"acc_norm_stderr": 0.034356961683613546
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2743589743589744,
"acc_stderr": 0.022622765767493214,
"acc_norm": 0.2743589743589744,
"acc_norm_stderr": 0.022622765767493214
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712173,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712173
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2815126050420168,
"acc_stderr": 0.029213549414372146,
"acc_norm": 0.2815126050420168,
"acc_norm_stderr": 0.029213549414372146
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.036313298039696545,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.036313298039696545
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3100917431192661,
"acc_stderr": 0.01983084968443975,
"acc_norm": 0.3100917431192661,
"acc_norm_stderr": 0.01983084968443975
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.027696910713093936,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.027696910713093936
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23039215686274508,
"acc_stderr": 0.029554292605695046,
"acc_norm": 0.23039215686274508,
"acc_norm_stderr": 0.029554292605695046
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25738396624472576,
"acc_stderr": 0.0284588209914603,
"acc_norm": 0.25738396624472576,
"acc_norm_stderr": 0.0284588209914603
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.19282511210762332,
"acc_stderr": 0.02647824096048936,
"acc_norm": 0.19282511210762332,
"acc_norm_stderr": 0.02647824096048936
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2066115702479339,
"acc_stderr": 0.03695980128098825,
"acc_norm": 0.2066115702479339,
"acc_norm_stderr": 0.03695980128098825
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.040598672469526864,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.040598672469526864
},
"harness|hendrycksTest-management|5": {
"acc": 0.32038834951456313,
"acc_stderr": 0.0462028408228004,
"acc_norm": 0.32038834951456313,
"acc_norm_stderr": 0.0462028408228004
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.20512820512820512,
"acc_stderr": 0.026453508054040356,
"acc_norm": 0.20512820512820512,
"acc_norm_stderr": 0.026453508054040356
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.14,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.14,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.25722543352601157,
"acc_stderr": 0.02353292543104429,
"acc_norm": 0.25722543352601157,
"acc_norm_stderr": 0.02353292543104429
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.264804469273743,
"acc_stderr": 0.01475690648326066,
"acc_norm": 0.264804469273743,
"acc_norm_stderr": 0.01475690648326066
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.02555316999182653,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.02555316999182653
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.19935691318327975,
"acc_stderr": 0.022691033780549656,
"acc_norm": 0.19935691318327975,
"acc_norm_stderr": 0.022691033780549656
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2345679012345679,
"acc_stderr": 0.02357688174400572,
"acc_norm": 0.2345679012345679,
"acc_norm_stderr": 0.02357688174400572
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24113475177304963,
"acc_stderr": 0.025518731049537762,
"acc_norm": 0.24113475177304963,
"acc_norm_stderr": 0.025518731049537762
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2588005215123859,
"acc_stderr": 0.01118610904656461,
"acc_norm": 0.2588005215123859,
"acc_norm_stderr": 0.01118610904656461
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3272058823529412,
"acc_stderr": 0.028501452860396563,
"acc_norm": 0.3272058823529412,
"acc_norm_stderr": 0.028501452860396563
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24673202614379086,
"acc_stderr": 0.0174408203674025,
"acc_norm": 0.24673202614379086,
"acc_norm_stderr": 0.0174408203674025
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.27755102040816326,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.27755102040816326,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.03076944496729602,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.03076944496729602
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21084337349397592,
"acc_stderr": 0.031755547866299194,
"acc_norm": 0.21084337349397592,
"acc_norm_stderr": 0.031755547866299194
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.27485380116959063,
"acc_stderr": 0.03424042924691582,
"acc_norm": 0.27485380116959063,
"acc_norm_stderr": 0.03424042924691582
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24357405140758873,
"mc1_stderr": 0.01502635482491078,
"mc2": 0.4865533579688347,
"mc2_stderr": 0.01667138127210037
},
"harness|winogrande|5": {
"acc": 0.5177584846093133,
"acc_stderr": 0.014043619596174962
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
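For intuition on how the aggregated block relates to the per-task entries: the `"all"` accuracy is a macro-average over the per-task `acc` values. A toy recomputation over just three of the MMLU tasks shown above (illustrative only — the published 0.2483 averages every evaluated task, not these three):

```python
# Macro-average of per-task accuracies, in the spirit of the "all" block above.
# Only three tasks are included for brevity, so the result will not match the
# full-run value.
task_acc = {
    "harness|hendrycksTest-abstract_algebra|5": 0.25,
    "harness|hendrycksTest-anatomy|5": 0.23703703703703705,
    "harness|hendrycksTest-astronomy|5": 0.24342105263157895,
}
macro_avg = sum(task_acc.values()) / len(task_acc)
```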
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
russellsheep/drawbench-upsampled-zephyr-7b-alpha | ---
dataset_info:
features:
- name: Prompt
dtype: string
- name: Upsampled Prompt
dtype: string
- name: Category
dtype: string
splits:
- name: train
num_bytes: 88284
num_examples: 200
download_size: 52827
dataset_size: 88284
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
thercyl/BRK | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: float64
- name: Ticker
dtype: string
- name: Year
dtype: string
- name: Text
dtype: string
- name: Embedding
dtype: string
splits:
- name: train
num_bytes: 60382282
num_examples: 1731
download_size: 38342637
dataset_size: 60382282
---
# Dataset Card for "BRK"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
derexHf/MathInstructTop2K | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 5591391
num_examples: 2000
download_size: 2494977
dataset_size: 5591391
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Falah/catalogue_photography_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 114345
num_examples: 1000
download_size: 2050
dataset_size: 114345
---
# Dataset Card for "catalogue_photography_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
premio-ai/TheArabicPile_Poetry | ---
language:
- ar
license: cc-by-nc-4.0
task_categories:
- text-generation
dataset_info:
- config_name: dedup
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 311172546
num_examples: 61085
download_size: 154601576
dataset_size: 311172546
- config_name: original
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 287936720
num_examples: 61591
download_size: 152946428
dataset_size: 287936720
configs:
- config_name: dedup
data_files:
- split: train
path: dedup/train-*
- config_name: original
data_files:
- split: train
path: data/train-*
---
# The Arabic Pile

## Introduction:
The Arabic Pile is a comprehensive dataset meticulously designed to parallel the structure of The Pile and The Nordic Pile. Focused on the Arabic language, the dataset encompasses a vast array of linguistic nuances, incorporating both Modern Standard Arabic (MSA) and various Levantine, North African, and Egyptian dialects. Tailored for the training and fine-tuning of large language models, the dataset consists of 13 subsets, each uniquely crafted to cater to different linguistic domains.
## The Poetry Subset:
This dataset has a collection of Arabic poetry.
## Other Subsets:
1. premio-ai/TheArabicPile
2. premio-ai/TheArabicPile_Web
3. premio-ai/TheArabicPile_Lyrics
4. premio-ai/TheArabicPile_Reviews
5. premio-ai/TheArabicPile_Dialects
6. premio-ai/TheArabicPile_Mathematics
7. premio-ai/TheArabicPile_Conversational
8. premio-ai/TheArabicPile_Articles
9. premio-ai/TheArabicPile_Poetry
10. premio-ai/TheArabicPile_Medical
11. premio-ai/TheArabicPile_Miscellaneous
12. premio-ai/TheArabicPile_SocialMedia
13. premio-ai/TheArabicPile_Translations
14. premio-ai/TheArabicPile_Books
These subsets serve distinct purposes, ranging from mathematical content to conversational dialogue, medical texts, and more. Notably, there's a dedicated subset, "premio-ai/TheArabicPile_SocialMedia," emphasizing the inclusion of language commonly found in social media contexts.
## Dataset Description
* Curated by: Premio.AI team
* Language(s) (NLP): Arabic, multiple languages on the translation dataset.
* License: CC BY-NC 4.0 Deed - Non Commercial.
* For any commercial uses or licensing, please contact mo@premio.ai.
## Data Structure
The datasets are divided into two main subsets:
1. Original Subset: The raw data as collected from sources, without modifications.
2. Deduplication Subset: A filtered and cleaned version, enhancing usability for large language models by reducing redundancy and noise.
The Arabic Pile extends an invitation not only for training and fine-tuning large language models but also for diverse applications across linguistic domains. Whether for research, analysis, or other linguistic endeavors, The Arabic Pile stands as a rich resource for the exploration of Arabic language intricacies.
## Data Collection
Please refer to the paper for more details on our data collection procedures.
## Data Format
The dataset has a single column called `text`. Each entry combines the required metadata with the body of the text, so that it can be fed directly into training or fine-tuning of large language models.
Please note that the metadata might need to be repeated if your training context window cannot fit the entire body of text.
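As a concrete sketch of that repetition, the hypothetical helper below splits a long body into chunks and prefixes each chunk with the metadata header. It is not part of this dataset's tooling, and the word-based budget is an assumption standing in for a real tokenizer's token count:

```python
def chunk_with_metadata(metadata: str, body: str, max_words: int) -> list[str]:
    """Split a long body into chunks, repeating the metadata header in each.

    Hypothetical helper: the word-based budget stands in for a real
    tokenizer's token count.
    """
    budget = max_words - len(metadata.split())
    if budget <= 0:
        raise ValueError("context window too small for the metadata header")
    body_words = body.split()
    chunks = []
    for start in range(0, len(body_words), budget):
        piece = " ".join(body_words[start:start + budget])
        chunks.append(metadata + "\n" + piece)
    return chunks

# A 20-word body with a 12-word window and a 2-word header -> 2 chunks,
# each repeating the metadata line.
chunks = chunk_with_metadata("title: qasida", " ".join(["word"] * 20), max_words=12)
```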
## Potential Bias
As with any large-scale dataset, The Arabic Pile is not immune to potential biases that may influence the training and performance of language models. It's crucial to transparently address these biases to ensure responsible usage and interpretation of the dataset. Here are some considerations:
1. Dialectal Imbalance: The dataset incorporates various Arabic dialects, with a focus on Levantine, North African, and Egyptian variants. However, there might be variations in the representation of these dialects, potentially leading to an imbalance in the training data.
2. Source Influence: Bias may arise from the sources of the original data. The dataset collects information from diverse platforms and domains, and biases inherent in those sources could transfer to the dataset.
3. Social Media Context: Some of our datasets contain language from social media and other online platforms. This subset may introduce biases inherent in online discourse, such as informal language, colloquial expressions, and potential subjectivity about politics, religion, or culture.
4. Genre and Domain Bias: Different subsets cater to distinct linguistic domains, such as medical texts, poetry, reviews, and more. Each domain carries its own linguistic characteristics, potentially leading to biases based on the genres represented.
## License Information for The Arabic Pile: No Commercial Use
The Arabic Pile is released under the Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0). This license is designed to facilitate the open sharing and collaboration of the dataset while ensuring responsible and non-commercial usage.
Key Points of the License:
* Attribution (BY): Users are free to share, adapt, and build upon the dataset, as long as they provide appropriate attribution to the dataset creators.
* Non-Commercial (NC): The dataset may not be used for commercial purposes. Any use for commercial gain requires explicit permission from the dataset creators.
* No Additional Restrictions: The license allows for maximum freedom of use, provided the terms of attribution and non-commercial use are adhered to.
How to Cite: When using The Arabic Pile in your work, please include a proper citation to acknowledge the dataset creators. A recommended citation can be found in the Citation section below for easy reference.
License Deed: For a comprehensive understanding of the terms and conditions, please refer to the CC BY-NC 4.0 License Deed.
By adopting this license, we aim to foster a collaborative and open environment for the exploration and advancement of Arabic language understanding and natural language processing.
## Citation
When utilizing The Arabic Pile in your research, development, or other projects, we kindly request that you cite the dataset using the following format:
```bibtex
@article{alrefaie2024arabicpile,
  author = {Mohamed Taher Alrefaie and Mahmoud Ibrahim Barbary and Ahmed Yasser Hassanein and Shiref Khaled Elhalawany and Karim Ashraf Elsayed and Ahmed Yasser},
  title = {The Arabic Pile: A Large Scale Dataset of Diverse Text for Large Language Modeling},
  year = {2024},
  url = {https://huggingface.co/datasets/premio-ai/TheArabicPile}
}
```
|
iwecht/hard_captions | ---
dataset_info:
features:
- name: annID
dtype: int64
- name: caption
dtype: string
- name: score
dtype: int64
splits:
- name: train
num_bytes: 364027
num_examples: 5000
download_size: 200465
dataset_size: 364027
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "hard_captions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
confit/wmms | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: species
dtype: string
- name: label
dtype:
class_label:
names:
'0': Atlantic_Spotted_Dolphin
'1': Bearded_Seal
'2': Beluga,_White_Whale
'3': Bottlenose_Dolphin
'4': Bowhead_Whale
'5': Clymene_Dolphin
'6': Common_Dolphin
'7': False_Killer_Whale
'8': Fin,_Finback_Whale
'9': Frasers_Dolphin
'10': Grampus,_Rissos_Dolphin
'11': Harp_Seal
'12': Humpback_Whale
'13': Killer_Whale
'14': Leopard_Seal
'15': Long-Finned_Pilot_Whale
'16': Melon_Headed_Whale
'17': Minke_Whale
'18': Narwhal
'19': Northern_Right_Whale
'20': Pantropical_Spotted_Dolphin
'21': Ross_Seal
'22': Rough-Toothed_Dolphin
'23': Short-Finned_Pacific_Pilot_Whale
'24': Southern_Right_Whale
'25': Sperm_Whale
'26': Spinner_Dolphin
'27': Striped_Dolphin
'28': Walrus
'29': Weddell_Seal
'30': White-beaked_Dolphin
'31': White-sided_Dolphin
splits:
- name: train
num_bytes: 1179470284
num_examples: 1357
- name: test
num_bytes: 154350686
num_examples: 340
download_size: 1217429434
dataset_size: 1333820970
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
task_categories:
- audio-classification
tags:
- multiclass
size_categories:
- 1K<n<10K
---
# Watkins Marine Mammal Sound (WMMS) Database
Sound files in this database are free to download for personal or academic (not commercial) use.
Sound files and associated metadata are credited as follows: "Watkins Marine Mammal Sound Database, Woods Hole Oceanographic Institution and the New Bedford Whaling Museum."
The database can be found and downloaded from [here](https://archive.org/details/watkins_202104).
In this database version, the audio archive includes sounds of 32 species:
- Atlantic_Spotted_Dolphin
- Bearded_Seal
- Beluga,_White_Whale
- Bottlenose_Dolphin
- Bowhead_Whale
- Clymene_Dolphin
- Common_Dolphin
- False_Killer_Whale
- Fin,_Finback_Whale
- Frasers_Dolphin
- Grampus,_Rissos_Dolphin
- Harp_Seal
- Humpback_Whale
- Killer_Whale
- Leopard_Seal
- Long-Finned_Pilot_Whale
- Melon_Headed_Whale
- Minke_Whale
- Narwhal
- Northern_Right_Whale
- Pantropical_Spotted_Dolphin
- Ross_Seal
- Rough-Toothed_Dolphin
- Short-Finned_Pacific_Pilot_Whale
- Southern_Right_Whale
- Sperm_Whale
- Spinner_Dolphin
- Striped_Dolphin
- Walrus
- Weddell_Seal
- White-beaked_Dolphin
- White-sided_Dolphin
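Since the archive ships with no official train/test split (as noted at the end of this card), an 80/20 split has to be drawn by the user. A minimal seeded sketch over clip indices — plain random, which is an assumption; the split published in this repository may have been drawn differently (e.g., stratified by species):

```python
import random

def split_indices(n_clips: int, train_frac: float = 0.8, seed: int = 0):
    """Seeded random split over clip indices (illustrative only)."""
    rng = random.Random(seed)
    indices = list(range(n_clips))
    rng.shuffle(indices)
    cut = int(n_clips * train_frac)
    return indices[:cut], indices[cut:]

# 1697 clips in total -> 1357 for training and 340 for testing.
train_idx, test_idx = split_indices(1697)
```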
Since there is no official train/test split, we use 80% of the samples for training (1357) and the rest for testing (340). |
RogerB/kin_en_DigitalUmuganda | ---
dataset_info:
features:
- name: rw
dtype: string
- name: en
dtype: string
splits:
- name: train
num_bytes: 4550456
num_examples: 47824
download_size: 2836819
dataset_size: 4550456
---
# Dataset Card for "kin_en_DigitalUmuganda"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
# Dataset Information
The dataset was created by [DigitalUmuganda](https://huggingface.co/datasets/DigitalUmuganda/kinyarwanda-english-machine-translation-dataset/tree/main) for machine translation from Kinyarwanda to English. |
beta-reduction/webcrawl-202401 | ---
license: cc-by-sa-3.0
---
|
zjysteven/WikiMIA_paraphrased_perturbed | ---
dataset_info:
features:
- name: input
dtype: string
- name: label
dtype: int64
splits:
- name: WikiMIA_length32_paraphrased
num_bytes: 163365
num_examples: 776
- name: WikiMIA_length64_paraphrased
num_bytes: 224644
num_examples: 542
- name: WikiMIA_length128_paraphrased
num_bytes: 206645
num_examples: 250
- name: WikiMIA_length32_perturbed
num_bytes: 1650773
num_examples: 7760
- name: WikiMIA_length64_perturbed
num_bytes: 2255354
num_examples: 5420
- name: WikiMIA_length128_perturbed
num_bytes: 2092896
num_examples: 2500
- name: WikiMIA_length32_paraphrased_perturbed
num_bytes: 1662467
num_examples: 7760
- name: WikiMIA_length64_paraphrased_perturbed
num_bytes: 2286059
num_examples: 5420
- name: WikiMIA_length128_paraphrased_perturbed
num_bytes: 2105242
num_examples: 2500
download_size: 3282711
dataset_size: 12647445
configs:
- config_name: default
data_files:
- split: WikiMIA_length32_paraphrased
path: data/WikiMIA_length32_paraphrased-*
- split: WikiMIA_length64_paraphrased
path: data/WikiMIA_length64_paraphrased-*
- split: WikiMIA_length128_paraphrased
path: data/WikiMIA_length128_paraphrased-*
- split: WikiMIA_length32_perturbed
path: data/WikiMIA_length32_perturbed-*
- split: WikiMIA_length64_perturbed
path: data/WikiMIA_length64_perturbed-*
- split: WikiMIA_length128_perturbed
path: data/WikiMIA_length128_perturbed-*
- split: WikiMIA_length32_paraphrased_perturbed
path: data/WikiMIA_length32_paraphrased_perturbed-*
- split: WikiMIA_length64_paraphrased_perturbed
path: data/WikiMIA_length64_paraphrased_perturbed-*
- split: WikiMIA_length128_paraphrased_perturbed
path: data/WikiMIA_length128_paraphrased_perturbed-*
license: mit
---
## 📘 WikiMIA paraphrased and perturbed versions
The WikiMIA dataset serves as a benchmark designed to evaluate membership inference attack (MIA) methods, specifically in detecting pretraining data of large language models.
It was originally constructed by Shi et al. (see the [original data repo](https://huggingface.co/datasets/swj0419/WikiMIA) for more details).
- The authors studied a *paraphrased* setting in their paper, where the goal is to detect (slightly) paraphrased versions of training texts rather than verbatim ones. Unfortunately, they did not release such data splits. Here we provide our paraphrased version,
which is obtained by instructing ChatGPT to replace a certain number of words without changing the original semantic meaning.
- We further provide perturbed versions of WikiMIA, which are necessary to run the Neighbor attack. Perturbed versions are obtained by perturbing each input sentence with a masked language model.
Each input has been perturbed 10 times, so you don't have to repeat this process yourself (which can be time-consuming).
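For intuition on why the perturbed copies exist: the Neighbor attack compares the model's loss on the target text with the mean loss over its perturbed neighbors; members tend to score noticeably below their neighbors. A minimal sketch with stand-in loss values (real losses would come from the language model under attack):

```python
def neighbor_score(target_loss: float, neighbor_losses: list[float]) -> float:
    """Gap between the target text's loss and the mean loss of its perturbed
    neighbors. Training members tend to sit noticeably *below* their
    neighbors, so a more negative score suggests membership."""
    return target_loss - sum(neighbor_losses) / len(neighbor_losses)

# Stand-in losses for one input and its 10 perturbed copies (as provided
# per input in this dataset); these numbers are illustrative only.
member = neighbor_score(2.1, [2.9, 3.0, 2.8, 3.1, 2.7, 3.0, 2.9, 3.2, 2.8, 3.0])
non_member = neighbor_score(3.0, [3.0, 2.9, 3.1, 3.0, 2.8, 3.1, 2.9, 3.0, 3.2, 2.9])
# member is clearly negative, non_member is near zero
```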
## 💻 Loading the datasets
To load the dataset:
```python
from datasets import load_dataset
LENGTH = 32
SPLIT_NAME = "paraphrased"
dataset = load_dataset("zjysteven/WikiMIA_paraphrased_perturbed", split=f"WikiMIA_length{LENGTH}_{SPLIT_NAME}")
```
* LENGTH: choose from `32, 64, 128`, which is the length of the input text.
* SPLIT_NAME: choose from `"paraphrased", "perturbed", "paraphrased_perturbed"`.
* *Label 0*: Refers to the unseen (non-training) data during pretraining. *Label 1*: Refers to the seen (training) data.
## 🛠️ Codebase
For more details on evaluating multiple MIA methods on these WikiMIA datasets, visit our [GitHub repository](https://github.com/zjysteven/mink-plus-plus), where we also propose
a novel method, **Min-K%++**, that significantly outperforms both the Min-K% by Shi et al. and other baseline methods.
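For reference, the original Min-K% score by Shi et al. (the baseline that Min-K%++ improves on) averages the lowest k% of token log-probabilities, with higher (less negative) scores suggesting membership. A minimal sketch — the log-probs below are stand-in numbers, not outputs of any real model:

```python
def min_k_score(token_logprobs: list[float], k: float = 0.2) -> float:
    """Average of the lowest k fraction of token log-probabilities (Min-K%).

    Training members rarely contain tokens the model finds very surprising,
    so their lowest log-probs are less negative than non-members'.
    """
    n = max(1, int(len(token_logprobs) * k))
    lowest = sorted(token_logprobs)[:n]
    return sum(lowest) / n

# Stand-in log-probs for a 10-token input; two very surprising tokens dominate.
score = min_k_score([-0.1, -0.2, -5.0, -0.3, -0.1, -8.0, -0.2, -0.4, -0.1, -0.3])
# averages the two lowest values, -8.0 and -5.0
```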
## ⭐ Citing our Work
If you find our codebase and datasets beneficial, kindly cite our work and the original WikiMIA:
```bibtex
@misc{zhang2024mink,
title={Min-K%++: Improved Baseline for Detecting Pre-Training Data from Large Language Models},
author={Jingyang Zhang and Jingwei Sun and Eric Yeats and Yang Ouyang and Martin Kuo and Jianyi Zhang and Hao Yang and Hai Li},
year={2024},
eprint={2404.02936},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
@inproceedings{
shi2024detecting,
title={Detecting Pretraining Data from Large Language Models},
author={Weijia Shi and Anirudh Ajith and Mengzhou Xia and Yangsibo Huang and Daogao Liu and Terra Blevins and Danqi Chen and Luke Zettlemoyer},
booktitle={The Twelfth International Conference on Learning Representations},
year={2024},
url={https://openreview.net/forum?id=zWqr3MQuNs}
}
``` |
seansullivan/automation-txt | ---
license: other
license_name: me
license_link: LICENSE
---
|
kevinassogba/frbam-llama2 | ---
license: mit
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 898028
num_examples: 3000
- name: test
num_bytes: 300470
num_examples: 1000
- name: val
num_bytes: 317907
num_examples: 1000
download_size: 981949
dataset_size: 1516405
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: val
path: data/val-*
---
|
CyberHarem/sumire_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of sumire/乙花スミレ/菫 (Blue Archive)
This is the dataset of sumire/乙花スミレ/菫 (Blue Archive), containing 92 images and their tags.
The core tags of this character are `long_hair, ponytail, breasts, black_hair, purple_eyes, hair_ornament, hair_flower, scrunchie, purple_scrunchie, very_long_hair, hair_scrunchie, large_breasts, halo, sidelocks, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 92 | 141.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sumire_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 92 | 119.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sumire_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 238 | 244.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sumire_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sumire_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, bare_shoulders, black_pants, crop_top, flower, green_choker, midriff, navel, solo, stomach, white_sports_bra, cowboy_shot, looking_at_viewer, yoga_pants, cleavage, simple_background, blush, bottle, closed_mouth, high_ponytail, holding_towel, parted_lips, sweatband, white_background, wristband |
| 1 | 5 |  |  |  |  |  | 1girl, black_pants, blush, eyewear_hang, flower, long_sleeves, looking_at_viewer, midriff, open_jacket, solo, white_sports_bra, yoga_pants, bare_shoulders, collarbone, green_choker, navel, off_shoulder, stomach, sunglasses, unworn_eyewear, white_jacket, cleavage, cowboy_shot, open_mouth, parted_lips, standing, sweat, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | black_pants | crop_top | flower | green_choker | midriff | navel | solo | stomach | white_sports_bra | cowboy_shot | looking_at_viewer | yoga_pants | cleavage | simple_background | blush | bottle | closed_mouth | high_ponytail | holding_towel | parted_lips | sweatband | white_background | wristband | eyewear_hang | long_sleeves | open_jacket | collarbone | off_shoulder | sunglasses | unworn_eyewear | white_jacket | open_mouth | standing | sweat |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------------|:-----------|:---------|:---------------|:----------|:--------|:-------|:----------|:-------------------|:--------------|:--------------------|:-------------|:-----------|:--------------------|:--------|:---------|:---------------|:----------------|:----------------|:--------------|:------------|:-------------------|:------------|:---------------|:---------------|:--------------|:-------------|:---------------|:-------------|:-----------------|:---------------|:-------------|:-----------|:--------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | | X | | | | | X | | X | | X | X | X | X | X | X | X | X | X | X | X |
|
Bsbell21/MarketMailAI180 | ---
dataset_info:
features:
- name: product
dtype: string
- name: description
dtype: string
- name: marketing_email
dtype: string
splits:
- name: train
num_bytes: 109776.75
num_examples: 135
- name: test
num_bytes: 36592.25
num_examples: 45
download_size: 89875
dataset_size: 146369.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
distil-whisper/rev16 | ---
dataset_info:
- config_name: full
features:
- name: audio
dtype: audio
- name: file_number
dtype: string
- name: show_title
dtype: string
- name: episode_title
dtype: string
- name: itunes_id
dtype: string
- name: transcription
dtype: string
splits:
- name: test
num_bytes: 1509910660.0
num_examples: 30
download_size: 1445493754
dataset_size: 1509910660.0
- config_name: whisper_subset
features:
- name: audio
dtype: audio
- name: file_number
dtype: string
- name: show_title
dtype: string
- name: episode_title
dtype: string
- name: itunes_id
dtype: string
- name: transcription
dtype: string
splits:
- name: test
num_bytes: 921693242.0
num_examples: 16
download_size: 881542397
dataset_size: 921693242.0
configs:
- config_name: full
data_files:
- split: test
path: full/test-*
- config_name: whisper_subset
data_files:
- split: test
path: whisper_subset/test-*
---
# Dataset Card for "rev16"
Configs:
* `full`: all 30 podcast files
* `whisper_subset`: the subset of 16 podcast files used in the Whisper paper for long-form evaluation. The remaining 14 files have mismatches between the audio and the labels, and are therefore filtered from the test set. |
MohammadJamalaldeen/google_fleurs_ar | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 2021551248
num_examples: 2104
- name: test
num_bytes: 411235560
num_examples: 428
download_size: 901727231
dataset_size: 2432786808
---
# Dataset Card for "google_fleurs_ar"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
huggingartists/the-69-eyes | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/the-69-eyes"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.162381 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/9e0451fa9d3f8cf38aa11994dbd934a8.600x600x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/the-69-eyes">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">The 69 Eyes</div>
<a href="https://genius.com/artists/the-69-eyes">
<div style="text-align: center; font-size: 14px;">@the-69-eyes</div>
</a>
</div>
### Dataset Summary
This lyrics dataset was parsed from Genius and is designed for generating lyrics with HuggingArtists.
The model is available [here](https://huggingface.co/huggingartists/the-69-eyes).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
To load this dataset directly with the `datasets` library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/the-69-eyes")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|168| -| -|
The `train` split can easily be divided into `train`, `validation`, and `test` splits with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/the-69-eyes")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(datasets['train']['text'], [int(len(datasets['train']['text'])*train_percentage), int(len(datasets['train']['text'])*(train_percentage + validation_percentage))])
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
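As a sanity check on the fractions above (not part of the original card), the same `np.split` call can be run on a toy array of 168 placeholder strings, matching the size of this train split:

```python
import numpy as np

# 168 placeholder lyrics, standing in for datasets['train']['text']
texts = np.array([f"song_{i}" for i in range(168)])

train_percentage, validation_percentage = 0.9, 0.07
train, validation, test = np.split(
    texts,
    [int(len(texts) * train_percentage),
     int(len(texts) * (train_percentage + validation_percentage))],
)
print(len(train), len(validation), len(test))  # 151 11 6
```

The three pieces sum back to 168, so no examples are lost at the split boundaries.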
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
  author = {Aleksey Korshuk},
  year   = {2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
CyberHarem/perfumer_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of perfumer/パフューマー/调香师 (Arknights)
This is the dataset of perfumer/パフューマー/调香师 (Arknights), containing 229 images and their tags.
The core tags of this character are `animal_ears, brown_hair, fox_ears, ponytail, bow, hair_bow, brown_eyes, long_hair, blue_bow, fox_girl, breasts, tail, striped_bow`; these are pruned from the tags in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan). The auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 229 | 341.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/perfumer_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 229 | 295.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/perfumer_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 564 | 575.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/perfumer_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/perfumer_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, bare_shoulders, long_sleeves, off_shoulder, open_jacket, solo, white_dress, looking_at_viewer, blue_jacket, collarbone, holding_staff, smile, fur_trim, cowboy_shot, white_flower |
| 1 | 9 |  |  |  |  |  | 1girl, bare_shoulders, holding_staff, looking_at_viewer, off_shoulder, solo, fox_tail, long_sleeves, open_jacket, white_dress, blue_jacket, full_body, high_heels, simple_background, white_background, black_footwear, collarbone, frilled_dress, smile, animal, black_jacket, white_flower, fur_trim |
| 2 | 7 |  |  |  |  |  | 1girl, black_pantyhose, fox_tail, long_sleeves, white_shirt, belt, id_card, solo, closed_mouth, collared_shirt, holding, looking_at_viewer, official_alternate_costume, smile, black_footwear, full_body, shoes, simple_background, vial, white_background, white_jacket, black_skirt, cowboy_shot, hand_up |
| 3 | 6 |  |  |  |  |  | 1girl, dress, hairband, long_sleeves, official_alternate_costume, solo, closed_mouth, hair_between_eyes, looking_at_viewer, simple_background, smile, yellow_bow, fox, holding_book, open_book, sitting, upper_body, white_background |
| 4 | 5 |  |  |  |  |  | 1girl, blush, completely_nude, fox_tail, cum_in_pussy, medium_breasts, nipples, sweat, 1boy, anus, hetero, looking_at_viewer, lying, mosaic_censoring, open_mouth, ass, collarbone, cum_overflow, hair_between_eyes, heart-shaped_pupils, looking_back, on_bed, penis, sex_from_behind, solo_focus, spread_legs, vaginal |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | long_sleeves | off_shoulder | open_jacket | solo | white_dress | looking_at_viewer | blue_jacket | collarbone | holding_staff | smile | fur_trim | cowboy_shot | white_flower | fox_tail | full_body | high_heels | simple_background | white_background | black_footwear | frilled_dress | animal | black_jacket | black_pantyhose | white_shirt | belt | id_card | closed_mouth | collared_shirt | holding | official_alternate_costume | shoes | vial | white_jacket | black_skirt | hand_up | dress | hairband | hair_between_eyes | yellow_bow | fox | holding_book | open_book | sitting | upper_body | blush | completely_nude | cum_in_pussy | medium_breasts | nipples | sweat | 1boy | anus | hetero | lying | mosaic_censoring | open_mouth | ass | cum_overflow | heart-shaped_pupils | looking_back | on_bed | penis | sex_from_behind | solo_focus | spread_legs | vaginal |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:---------------|:---------------|:--------------|:-------|:--------------|:--------------------|:--------------|:-------------|:----------------|:--------|:-----------|:--------------|:---------------|:-----------|:------------|:-------------|:--------------------|:-------------------|:-----------------|:----------------|:---------|:---------------|:------------------|:--------------|:-------|:----------|:---------------|:-----------------|:----------|:-----------------------------|:--------|:-------|:---------------|:--------------|:----------|:--------|:-----------|:--------------------|:-------------|:------|:---------------|:------------|:----------|:-------------|:--------|:------------------|:---------------|:-----------------|:----------|:--------|:-------|:-------|:---------|:--------|:-------------------|:-------------|:------|:---------------|:----------------------|:---------------|:---------|:--------|:------------------|:-------------|:--------------|:----------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | X | | | X | | X | | | | X | | X | | X | X | | X | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | X | | | X | | X | | | | X | | | | | | | X | X | | | | | | | | | X | | | X | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | | | | | X | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
datadreamer-dev/abstracts_and_tweets | ---
dataset_info:
features:
- name: abstracts
dtype: string
- name: prompts
dtype: string
- name: tweets
dtype: string
splits:
- name: train
num_bytes: 3127163
num_examples: 900
- name: validation
num_bytes: 343839
num_examples: 100
download_size: 1765300
dataset_size: 3471002
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
library_name: datadreamer
size_categories:
- 1K<n<10K
tags:
- datadreamer
- datadreamer-0.1.0
- synthetic
- gpt-4
---
# Dataset Card
This is a synthetic dataset of arXiv-style research paper abstracts and tweets summarizing them, used as a demonstration of the [DataDreamer 🤖💤 library](https://datadreamer.dev/docs/latest/). It was used to train an ["Abstract to Tweet" model](https://huggingface.co/datadreamer-dev/abstracts_to_tweet_model).
---
This dataset was produced with [DataDreamer 🤖💤](https://datadreamer.dev). The synthetic dataset card can be found [here](datadreamer.json). |
emmas96/Lenselink | ---
license: gpl-3.0
---
|
automated-research-group/llama2_7b-arc_hard-results_playing | ---
dataset_info:
config_name: '{''do_sample''=False, ''beams''=1}'
features:
- name: id
dtype: string
- name: prediction
dtype: string
- name: bool_accuracy
dtype: bool
splits:
- name: train
num_bytes: 11410
num_examples: 299
download_size: 9803
dataset_size: 11410
configs:
- config_name: '{''do_sample''=False, ''beams''=1}'
data_files:
- split: train
path: '{''do_sample''=False, ''beams''=1}/train-*'
---
# Dataset Card for "llama2_7b-arc_hard-results_playing"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
harishvs/imdb_review_prompt_small | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': neg
'1': pos
splits:
- name: train
num_bytes: 145038
num_examples: 391
- name: test
num_bytes: 152454
num_examples: 429
download_size: 147356
dataset_size: 297492
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
cowlag/sd-webui-config | ---
license: unknown
---
|
ilhamxx/my_data_receipt | ---
license: unknown
---
|
afasafen/mydataset | ---
license: afl-3.0
---
|
sourcegraph/fine-tune-unit-test-call-exp-context-dataset-java | ---
dataset_info:
features:
- name: repo_url
dtype: string
- name: language
dtype: string
- name: source_file_path
dtype: string
- name: test_file_path
dtype: string
- name: source_fn_block
dtype: string
- name: source_fn_name
dtype: string
- name: test_fn_block
dtype: string
- name: test_fn_name
dtype: string
- name: source_fn_call_exps
sequence: string
- name: test_fn_call_exps
sequence: string
- name: test_file_additional_context
struct:
- name: class_fields
dtype: string
- name: class_name
dtype: string
- name: source_file_additional_context
struct:
- name: class_fields
dtype: string
- name: class_name
dtype: string
- name: method_signatures
dtype: string
- name: prompt
list:
- name: content
dtype: string
- name: role
dtype: string
- name: response
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 992106219
num_examples: 99977
- name: validation
num_bytes: 130304452
num_examples: 12807
- name: test
num_bytes: 140926073
num_examples: 13825
download_size: 234465582
dataset_size: 1263336744
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
crumb/askmistral-pile-011-filtered | ---
dataset_info:
features:
- name: text
dtype: string
- name: pos
dtype: float64
splits:
- name: train
num_bytes: 5200925105.26056
num_examples: 669990
download_size: 3017098561
dataset_size: 5200925105.26056
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BangumiBase/deatte5byoudebattle | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Deatte 5-byou De Battle
This is the image base of the bangumi Deatte 5-byou de Battle. We detected 30 characters and 2195 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noise.** If you intend to train models on this dataset, we recommend preprocessing the downloaded data to remove potential noisy samples (approximately 1% of the images).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 143 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 94 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 34 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 24 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 127 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 34 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 29 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 22 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 68 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 81 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 42 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 12 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 61 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 127 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 67 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 10 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 25 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 12 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 105 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 52 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 5 | [Download](20/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 21 | 74 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 27 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 20 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 7 | [Download](24/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 25 | 297 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 7 | [Download](26/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 27 | 17 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 27 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 545 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
XYLF/autotrain-data-flan-t5-tuning | ---
task_categories:
- translation
---
# AutoTrain Dataset for project: flan-t5-tuning
## Dataset Description
This dataset has been automatically processed by AutoTrain for project flan-t5-tuning.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"target": "G(!( oGupGnpHlFihSN ))",
"source": "it never happens that oGupGnpHlFihSN",
"feat_Unnamed: 2": null
},
{
"target": "G(!( uJwMVmQcOjk & NFbgbwYf & uwbnvOQXgDVD ))",
"source": "at no time uJwMVmQcOjk and, at the same time, NFbgbwYf and uwbnvOQXgDVD",
"feat_Unnamed: 2": null
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"target": "Value(dtype='string', id=None)",
"source": "Value(dtype='string', id=None)",
"feat_Unnamed: 2": "Value(dtype='float64', id=None)"
}
```
### Dataset Splits
This dataset is split into train and validation splits. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 2399 |
| valid | 600 |
|
MLRS/masri_synthetic | ---
annotations_creators:
- machine-generated
language:
- mt
language_creators:
- machine-generated
license: cc-by-nc-sa-4.0
multilinguality:
- monolingual
pretty_name: "MASRI-SYNTHETIC: Synthetized Speech with Transcriptions in Maltese."
size_categories:
- 10K<n<100K
source_datasets:
- original
tags:
- masri
- maltese
- masri-project
- malta
- synthetic speech
- tts
task_categories:
- automatic-speech-recognition
task_ids: []
---
# Dataset Card for masri_synthetic
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [MASRI Project](https://www.um.edu.mt/projects/masri/)
- **Repository:** [MASRI Data Repo](https://github.com/UMSpeech/)
- **Repository:** [LDC](https://catalog.ldc.upenn.edu/LDC2022S08)
- **Paper:** [Data Augmentation for Speech Recognition in Maltese: A Low-Resource Perspective](https://www.um.edu.mt/library/oar/bitstream/123456789/92466/1/Data_Augmentation_for_Speech_Recognition_in_Maltese_A_Low_Resource_Perspective%282021%29.pdf)
- **Paper:** [Analysis of Data Augmentation Methods for Low-Resource Maltese ASR](https://arxiv.org/pdf/2111.07793.pdf)
### Dataset Summary
MASRI-SYNTHETIC is a corpus of synthesized speech in Maltese. The text-to-speech (TTS) system used to produce the utterances was developed by the Research & Development Department of Crimsonwing p.l.c.
The sentences used to create the corpus were extracted from the [MLRS Corpus](https://mlrs.research.um.edu.mt/index.php?page=corpora), which is a corpus of written or transcribed Maltese divided into different genres, including: culture, news, academic, religion, sports, etc.
[MASRI](https://www.um.edu.mt/projects/masri/) stands for "Maltese Automatic Speech Recognition I". [MASRI](https://www.um.edu.mt/projects/masri/) is a project at the [University of Malta](https://www.um.edu.mt/), funded by the University of Malta Research Fund Award Scheme.
### Example Usage
MASRI-SYNTHETIC contains only a train split:
```python
from datasets import load_dataset
masri_synthetic = load_dataset("MLRS/masri_synthetic")
```
It is also valid to do:
```python
from datasets import load_dataset
masri_synthetic = load_dataset("MLRS/masri_synthetic",split="train")
```
### Supported Tasks
automatic-speech-recognition: The dataset can be used to train a model for Automatic Speech Recognition (ASR). The model is presented with an audio file and asked to transcribe the audio file to written text. The most common evaluation metric is the word error rate (WER).
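Word error rate is the word-level edit distance between a hypothesis and a reference transcription, divided by the number of reference words. A minimal, self-contained sketch (a generic helper, not part of the MASRI tooling):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref = reference.split()
    hyp = hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = dp[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            dp[i][j] = min(sub, dp[i - 1][j] + 1, dp[i][j - 1] + 1)
    return dp[len(ref)][len(hyp)] / len(ref)

# one substitution over three reference words -> 1/3
print(wer("il-poplu ta' malta", "il-poplu ta malta"))
```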
### Languages
The language of the corpus is Maltese.
## Dataset Structure
### Data Instances
```python
{
'audio_id': 'MSRSY_F_0042_RN01PP10_0143',
'audio': {
'path': '/home/carlos/.cache/HuggingFace/datasets/downloads/extracted/17d8c60020489a5a43ba0cf322ed7c121375915c671b57fbdb03950befbd1a9c/female/F_0042_RN01PP10/MSRSY_F_0042_RN01PP10_0143.flac',
'array': array([0., 0., 0., ..., 0., 0., 0.], dtype=float32),
'sampling_rate': 16000
},
'speaker_id': 'F_0042',
'gender': 'female',
'duration': 9.0,
'speech_rate': '-01',
'pitch': '+10',
'normalized_text': "il-poplu b' pakkett ta' negozjati f' id-direttur ġenerali tal-uffiċċju tal-pubblikazzjonijiet uffiċjali għall-komunitajiet ewropej"
}
```
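The `audio_id` in the instance above appears to pack several of the other fields into one string (corpus tag, gender code, speaker number, recording block, segment index). A small parser illustrating that assumed layout — the component names here are guesses from the single example shown, not documented structure:

```python
def parse_masri_id(audio_id: str) -> dict:
    """Split a MASRI-SYNTHETIC audio ID such as
    "MSRSY_F_0042_RN01PP10_0143" into its apparent components.
    The layout is assumed from one example, not from documentation."""
    corpus, gender_code, speaker_num, block, segment = audio_id.split("_")
    return {
        "corpus": corpus,
        "speaker_id": f"{gender_code}_{speaker_num}",
        "gender": "female" if gender_code == "F" else "male",
        "block": block,
        "segment": segment,
    }

print(parse_masri_id("MSRSY_F_0042_RN01PP10_0143"))
```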
### Data Fields
* `audio_id` (string) - id of audio segment
* `audio` (datasets.Audio) - a dictionary containing the path to the audio, the decoded audio array, and the sampling rate. In non-streaming mode (default), the path points to the locally extracted audio. In streaming mode, the path is the relative path of an audio inside its archive (as files are not downloaded and extracted locally).
* `speaker_id` (string) - id of the synthetic voice
* `gender` (string) - gender of synthetic voice (male or female)
* `duration` (float32) - duration of the audio file in seconds.
* `speech_rate` (string) - speech rate, ranging from -2 to +2.
* `pitch` (string) - the pitch goes from -10 to +10.
* `normalized_text` (string) - normalized audio segment transcription
### Data Splits
The corpus has only a train split, which contains a total of 52,500 speech files from 105 male and 105 female voices, with a total duration of 99 hours and 18 minutes.
## Dataset Creation
### Curation Rationale
The MASRI-SYNTHETIC CORPUS (MSYC) has the following characteristics:
* The MSYC has an exact duration of 99 hours and 18 minutes, spread over 52,500 audio files.
* The MSYC has recordings from 210 different voices: 105 male and 105 female.
* Voices were produced by varying among 21 pitch values (-10 to +10) and 5 speech-rate values (-2 to +2).
* Data in the MSYC is organized by voice: all utterances belonging to a single voice are stored in a single directory.
* Data is also grouped by the gender (male/female) of the voice.
* Each voice is assigned 250 utterances of 13 words each.
* Every audio file in the MSYC has a duration of approximately 2 to 10 seconds.
* Audio files in the MSYC are distributed in a 16kHz@16bit mono format.
* Transcriptions in MSYC are in lowercase. No punctuation marks are permitted except for dashes (-) and apostrophes (') due to their importance in Maltese orthography.
* Every audio file has an ID that is compatible with ASR engines such as Kaldi and CMU-Sphinx.
### Source Data
#### Initial Data Collection and Normalization
The MASRI-SYNTHETIC CORPUS was possible thanks to the text-to-speech (TTS) system developed by the Research & Development Department of Crimsonwing p.l.c.
The sentences used to create the corpus were extracted from the [MLRS Corpus](https://mlrs.research.um.edu.mt/index.php?page=corpora).
### Annotations
#### Annotation process
Sentences from the [MLRS Corpus](https://mlrs.research.um.edu.mt/index.php?page=corpora) were selected and synthesized into utterances. The MASRI-SYNTHETIC is comprised of synthetic utterances only.
#### Who are the annotators?
The authors selected the sentences to be synthesized.
### Personal and Sensitive Information
The corpus is comprised of synthetic speech utterances from a TTS system. No personal or sensitive information is shared.
## Considerations for Using the Data
### Social Impact of Dataset
The MASRI-SYNTHETIC CORPUS is currently the only Maltese corpus of synthetic speech, and it is publicly available under a CC-BY-NC-SA-4.0 license.
### Discussion of Biases
* Sentences from [MLRS](https://mlrs.research.um.edu.mt/index.php?page=corpora) were placed in a single plain-text file. The text includes punctuation marks.
* To facilitate text processing, sentences were split to fit into lines of 30 words.
* Punctuation marks were removed, as were sentences containing non-UTF-8 characters.
* Sentences with foreign words and proper names were removed.
* As the letters "c" and "y" do not belong to the Maltese alphabet, sentences including words with either letter were removed, to ensure that only Maltese words appear in each sentence.
* Using Python, the resulting sentences were split into a flat list in which each element is a word.
* Words were then taken from the list one by one to produce text lines of exactly 13 words. This process generated only 27,714 of the 52,500 sentences that constitute the whole corpus.
* To produce the remaining sentences, the words of the list were shuffled and the previous step was repeated until the 52,500 sentences needed by the corpus were obtained.
* At the end, the produced sentences were converted into utterances using the TTS system.
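The line-building procedure described above (chunk the word list into 13-word lines, then shuffle and re-chunk until 52,500 lines exist) can be sketched as follows. This is a reconstruction under assumptions, since the original script is not published:

```python
import random

def make_lines(words, n_words=13, target=52500, seed=0):
    """Chunk `words` into lines of exactly `n_words` words, reshuffling the
    pool and re-chunking until `target` lines exist. A sketch of the corpus
    sentence-generation procedure, not the original script."""
    rng = random.Random(seed)
    pool = list(words)
    lines = []
    while len(lines) < target:
        # take non-overlapping 13-word windows from the current ordering
        for i in range(0, len(pool) - n_words + 1, n_words):
            lines.append(" ".join(pool[i:i + n_words]))
            if len(lines) == target:
                break
        rng.shuffle(pool)  # reshuffle to generate fresh word combinations
    return lines
```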
### Other Known Limitations
The MASRI team does not guarantee the accuracy of this corpus, nor its suitability for any specific purpose. In fact, we expect a number of errors, omissions and inconsistencies to remain in the corpus.
### Dataset Curators
The speech sentences were selected and synthesized by [Carlos Daniel Hernández Mena](https://huggingface.co/carlosdanielhernandezmena) at the [University of Malta](https://www.um.edu.mt/) (Msida Campus) in June 2020.
### Licensing Information
[CC-BY-NC-SA-4.0](https://creativecommons.org/licenses/by-nc-sa/4.0/)
### Citation Information
```
@misc{carlosmenamasrisynthetic2020,
title={MASRI-SYNTHETIC: Synthetized Speech with Transcriptions in Maltese.},
author={Hernandez Mena, Carlos Daniel and Gatt, Albert and DeMarco, Andrea and Borg, Claudia and van der Plas, Lonneke},
journal={MASRI Project, Malta},
year={2020},
url={https://huggingface.co/datasets/MLRS/masri_synthetic},
}
```
The MASRI-SYNTHETIC was also published at [LDC](https://catalog.ldc.upenn.edu/LDC2022S08) in 2022.
### Contributions
The authors would like to thank KPMG Microsoft Business Solutions (formerly CrimsonWing) for providing the TTS system used in our experiments. For more information about the CrimsonWing TTS system see [this presentation](https://pdfs.semanticscholar.org/5e5a/25e34b3c351ba0e58211a5192535e9ddea06.pdf).
We also want to thank the University of Malta Research Fund Award Scheme for making this project possible.
|
SiberiaSoft/SiberianDataset | ---
license: mit
task_categories:
- text-generation
- text2text-generation
- conversational
language:
- ru
size_categories:
- 100K<n<1M
---
### SiberiaSoft/SiberianDataset
A dataset of instructions, dialogues, and QA.
## Task distribution:
| Task | Percentage |
|:----------------------------------------------------------------------------:|:---------------------:|
| Chit-chat with context | 40.092% |
| Chit-chat without context (synthetic) | 15.391% |
| QA with short answers | 14.045% |
| Instructions from its5Q/yandex-q | 6.292% |
| Instructions from Den4ikAI/russian_instructions_2 | 4.568% |
| Instructions from lksy/ru_instruct_gpt4 (heavily cleaned) | 4.492% |
| Instructions from IlyaGusev/ru_turbo_alpaca_evol_instruct (very heavily cleaned)| 4.442% |
| QA with long, detailed answers | 4.441% |
| QA using Wikipedia | 3.617% |
| Question answering over text from Den4ikAI/ru_sberquad_long_answers | 2.448% |
| Problem solving | 0.14% |
| QA: explain to a child | 0.034% |
### Citation
```
@MISC{SiberianDataset,
author = {Denis Petrov, Ivan Ramovich},
title = {Russian dataset for Instruct/Chat models},
url = {https://huggingface.co/datasets/SiberiaSoft/SiberianDataset},
year = 2023
}
``` |
sinandraide/hotpot_qa_spread | ---
task_categories:
- question-answering
language:
- en
size_categories:
- 1K<n<10K
---
# Dataset Card for Dataset Name
This dataset is a spread (flattened) version of the HotpotQA dataset, which makes it compatible with LangChain's Hugging Face dataset loader.
This dataset card has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
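The original HotpotQA rows nest the supporting paragraphs inside a `context` struct with parallel `title` and `sentences` lists, which a loader expecting flat string columns cannot consume directly. A hypothetical sketch of the kind of flattening a "spread" version implies — the exact transform used for this dataset is not documented:

```python
def spread_example(example: dict) -> dict:
    """Flatten HotpotQA's nested `context` struct into one string column.
    Hypothetical sketch: the exact transform used for this dataset is
    not documented."""
    titles = example["context"]["title"]
    sentences = example["context"]["sentences"]
    context = " ".join(
        f"{title}: {''.join(sents)}" for title, sents in zip(titles, sentences)
    )
    return {
        "question": example["question"],
        "answer": example["answer"],
        "context": context,
    }
```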
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
The source dataset is https://huggingface.co/datasets/hotpot_qa. The original authors are Yang et al. (2018): https://arxiv.org/abs/1809.09600.
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_athirdpath__Iambe-20b-DARE-v2 | ---
pretty_name: Evaluation run of athirdpath/Iambe-20b-DARE-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [athirdpath/Iambe-20b-DARE-v2](https://huggingface.co/athirdpath/Iambe-20b-DARE-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_athirdpath__Iambe-20b-DARE-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-08T02:48:17.586217](https://huggingface.co/datasets/open-llm-leaderboard/details_athirdpath__Iambe-20b-DARE-v2/blob/main/results_2023-12-08T02-48-17.586217.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6035804809886537,\n\
\ \"acc_stderr\": 0.03294194113186395,\n \"acc_norm\": 0.608982387572558,\n\
\ \"acc_norm_stderr\": 0.0336160701060513,\n \"mc1\": 0.390452876376989,\n\
\ \"mc1_stderr\": 0.01707823074343145,\n \"mc2\": 0.5385363923413744,\n\
\ \"mc2_stderr\": 0.01567101081137168\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6023890784982935,\n \"acc_stderr\": 0.014301752223279538,\n\
\ \"acc_norm\": 0.6279863481228669,\n \"acc_norm_stderr\": 0.014124597881844456\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6562437761402111,\n\
\ \"acc_stderr\": 0.004739902411944536,\n \"acc_norm\": 0.8453495319657439,\n\
\ \"acc_norm_stderr\": 0.0036083220651418873\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365245,\n\
\ \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365245\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.038047497443647646,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.038047497443647646\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077615,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077615\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.032683358999363366,\n\
\ \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.032683358999363366\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3412698412698413,\n \"acc_stderr\": 0.024419234966819067,\n \"\
acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.024419234966819067\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7161290322580646,\n\
\ \"acc_stderr\": 0.02564938106302926,\n \"acc_norm\": 0.7161290322580646,\n\
\ \"acc_norm_stderr\": 0.02564938106302926\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.035014387062967806,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.035014387062967806\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7424242424242424,\n \"acc_stderr\": 0.031156269519646836,\n \"\
acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.031156269519646836\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.02541634309630644,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.02541634309630644\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.02469721693087894,\n \
\ \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.02469721693087894\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473072,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473072\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.031357095996135904,\n\
\ \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.031357095996135904\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3973509933774834,\n \"acc_stderr\": 0.039955240076816806,\n \"\
acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.039955240076816806\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7761467889908257,\n \"acc_stderr\": 0.017871217767790222,\n \"\
acc_norm\": 0.7761467889908257,\n \"acc_norm_stderr\": 0.017871217767790222\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389087,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389087\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326468,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326468\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7905491698595147,\n\
\ \"acc_stderr\": 0.014551310568143705,\n \"acc_norm\": 0.7905491698595147,\n\
\ \"acc_norm_stderr\": 0.014551310568143705\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.025009313790069716,\n\
\ \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.025009313790069716\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48268156424581005,\n\
\ \"acc_stderr\": 0.016712467441702523,\n \"acc_norm\": 0.48268156424581005,\n\
\ \"acc_norm_stderr\": 0.016712467441702523\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.027057974624494382,\n\
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.027057974624494382\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n\
\ \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n\
\ \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.02532988817190092,\n\
\ \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.02532988817190092\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n\
\ \"acc_stderr\": 0.012749206007657476,\n \"acc_norm\": 0.47131681877444587,\n\
\ \"acc_norm_stderr\": 0.012749206007657476\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5808823529411765,\n \"acc_stderr\": 0.029972807170464626,\n\
\ \"acc_norm\": 0.5808823529411765,\n \"acc_norm_stderr\": 0.029972807170464626\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6160130718954249,\n \"acc_stderr\": 0.019675808135281515,\n \
\ \"acc_norm\": 0.6160130718954249,\n \"acc_norm_stderr\": 0.019675808135281515\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982062,\n\
\ \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982062\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786848,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786848\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.03015113445777634,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.03015113445777634\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.390452876376989,\n\
\ \"mc1_stderr\": 0.01707823074343145,\n \"mc2\": 0.5385363923413744,\n\
\ \"mc2_stderr\": 0.01567101081137168\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838229\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.332827899924185,\n \
\ \"acc_stderr\": 0.012979892496598268\n }\n}\n```"
repo_url: https://huggingface.co/athirdpath/Iambe-20b-DARE-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|arc:challenge|25_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|gsm8k|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hellaswag|10_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-08T02-48-17.586217.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-08T02-48-17.586217.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- '**/details_harness|winogrande|5_2023-12-08T02-48-17.586217.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-08T02-48-17.586217.parquet'
- config_name: results
data_files:
- split: 2023_12_08T02_48_17.586217
path:
- results_2023-12-08T02-48-17.586217.parquet
- split: latest
path:
- results_2023-12-08T02-48-17.586217.parquet
---
# Dataset Card for Evaluation run of athirdpath/Iambe-20b-DARE-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/athirdpath/Iambe-20b-DARE-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [athirdpath/Iambe-20b-DARE-v2](https://huggingface.co/athirdpath/Iambe-20b-DARE-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_athirdpath__Iambe-20b-DARE-v2",
"harness_winogrande_5",
	split="latest")
```
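The aggregated metrics can be fetched the same way, by pointing at the "results" configuration instead of a per-task one. A minimal sketch (the helper name `load_aggregated_results` is illustrative, not part of this dataset; the repo id, config name, and split come from the card above):

```python
def load_aggregated_results(
    repo_id: str = "open-llm-leaderboard/details_athirdpath__Iambe-20b-DARE-v2",
):
    """Fetch the aggregated metrics for this evaluation run."""
    # Imported lazily so the helper can be defined without `datasets` installed.
    from datasets import load_dataset

    # "results" holds the aggregated scores; "latest" tracks the newest run.
    return load_dataset(repo_id, "results", split="latest")
```

Calling `load_aggregated_results()` downloads the `results_*.parquet` file and returns it as a `Dataset` whose rows mirror the JSON shown under "Latest results" below.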
## Latest results
These are the [latest results from run 2023-12-08T02:48:17.586217](https://huggingface.co/datasets/open-llm-leaderboard/details_athirdpath__Iambe-20b-DARE-v2/blob/main/results_2023-12-08T02-48-17.586217.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6035804809886537,
"acc_stderr": 0.03294194113186395,
"acc_norm": 0.608982387572558,
"acc_norm_stderr": 0.0336160701060513,
"mc1": 0.390452876376989,
"mc1_stderr": 0.01707823074343145,
"mc2": 0.5385363923413744,
"mc2_stderr": 0.01567101081137168
},
"harness|arc:challenge|25": {
"acc": 0.6023890784982935,
"acc_stderr": 0.014301752223279538,
"acc_norm": 0.6279863481228669,
"acc_norm_stderr": 0.014124597881844456
},
"harness|hellaswag|10": {
"acc": 0.6562437761402111,
"acc_stderr": 0.004739902411944536,
"acc_norm": 0.8453495319657439,
"acc_norm_stderr": 0.0036083220651418873
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.029647813539365245,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.029647813539365245
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.038047497443647646,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.038047497443647646
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077615,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077615
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.49361702127659574,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.49361702127659574,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.024419234966819067,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.024419234966819067
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7161290322580646,
"acc_stderr": 0.02564938106302926,
"acc_norm": 0.7161290322580646,
"acc_norm_stderr": 0.02564938106302926
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.035014387062967806,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.035014387062967806
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.031156269519646836,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.031156269519646836
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.02541634309630644,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.02541634309630644
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.02469721693087894,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.02469721693087894
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473072,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473072
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.031357095996135904,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.031357095996135904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.039955240076816806,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.039955240076816806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7761467889908257,
"acc_stderr": 0.017871217767790222,
"acc_norm": 0.7761467889908257,
"acc_norm_stderr": 0.017871217767790222
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389087,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389087
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326468,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326468
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7905491698595147,
"acc_stderr": 0.014551310568143705,
"acc_norm": 0.7905491698595147,
"acc_norm_stderr": 0.014551310568143705
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.025009313790069716,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.025009313790069716
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48268156424581005,
"acc_stderr": 0.016712467441702523,
"acc_norm": 0.48268156424581005,
"acc_norm_stderr": 0.016712467441702523
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.027057974624494382,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.027057974624494382
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.02532988817190092,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.02532988817190092
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47131681877444587,
"acc_stderr": 0.012749206007657476,
"acc_norm": 0.47131681877444587,
"acc_norm_stderr": 0.012749206007657476
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5808823529411765,
"acc_stderr": 0.029972807170464626,
"acc_norm": 0.5808823529411765,
"acc_norm_stderr": 0.029972807170464626
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6160130718954249,
"acc_stderr": 0.019675808135281515,
"acc_norm": 0.6160130718954249,
"acc_norm_stderr": 0.019675808135281515
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6816326530612244,
"acc_stderr": 0.029822533793982062,
"acc_norm": 0.6816326530612244,
"acc_norm_stderr": 0.029822533793982062
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786848,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786848
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.03015113445777634,
"acc_norm": 0.9,
"acc_norm_stderr": 0.03015113445777634
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.390452876376989,
"mc1_stderr": 0.01707823074343145,
"mc2": 0.5385363923413744,
"mc2_stderr": 0.01567101081137168
},
"harness|winogrande|5": {
"acc": 0.7703235990528808,
"acc_stderr": 0.011821645601838229
},
"harness|gsm8k|5": {
"acc": 0.332827899924185,
"acc_stderr": 0.012979892496598268
}
}
```
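The per-task results above can be aggregated outside the leaderboard tooling. A minimal sketch (plain Python, not the official leaderboard computation), using a small illustrative subset of the entries shown in the JSON above, of macro-averaging the MMLU (`hendrycksTest`) subtask accuracies:

```python
# Minimal sketch: macro-average the MMLU (hendrycksTest) subtask accuracies
# from a results dict shaped like the JSON above. The entries here are a
# small illustrative subset copied from the results shown.
results = {
    "harness|hendrycksTest-computer_security|5": {"acc": 0.72},
    "harness|hendrycksTest-econometrics|5": {"acc": 0.30701754385964913},
    "harness|gsm8k|5": {"acc": 0.332827899924185},
}

# Only hendrycksTest-* tasks contribute to the MMLU average.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_avg, 4))  # prints 0.5135
```

The same pattern works on a full results JSON loaded with `json.load`; the official aggregated numbers live in the "results" configuration of this dataset.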
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Selfira (Granblue Fantasy)
This is the dataset of Selfira (Granblue Fantasy), containing 11 images and their tags.
The core tags of this character are `animal_ears, red_hair, long_hair, bangs, breasts, ponytail, mole, medium_breasts, brown_eyes, mole_under_eye, brown_hair`, which are pruned from the per-image tag lists in this dataset.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([HuggingFace organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 9.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/selfira_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 6.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/selfira_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 25 | 13.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/selfira_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 8.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/selfira_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 25 | 15.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/selfira_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/selfira_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, erune, solo, looking_at_viewer, red_dress, bare_shoulders, simple_background, detached_sleeves, bare_back, cape, from_behind, looking_back, ass, backless_dress, blush, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | erune | solo | looking_at_viewer | red_dress | bare_shoulders | simple_background | detached_sleeves | bare_back | cape | from_behind | looking_back | ass | backless_dress | blush | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:--------------------|:------------|:-----------------|:--------------------|:-------------------|:------------|:-------|:--------------|:---------------|:------|:-----------------|:--------|:-------------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
---
pretty_name: Evaluation run of jan-hq/LlamaCorn-1.1B-Chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jan-hq/LlamaCorn-1.1B-Chat](https://huggingface.co/jan-hq/LlamaCorn-1.1B-Chat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jan-hq__LlamaCorn-1.1B-Chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-12T10:29:25.854017](https://huggingface.co/datasets/open-llm-leaderboard/details_jan-hq__LlamaCorn-1.1B-Chat/blob/main/results_2024-03-12T10-29-25.854017.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.29373569196097277,\n\
\ \"acc_stderr\": 0.032243803435004895,\n \"acc_norm\": 0.2960484401036193,\n\
\ \"acc_norm_stderr\": 0.03310756115855655,\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931586,\n \"mc2\": 0.36855840909843307,\n\
\ \"mc2_stderr\": 0.013989365630749612\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.318259385665529,\n \"acc_stderr\": 0.013611993916971451,\n\
\ \"acc_norm\": 0.3378839590443686,\n \"acc_norm_stderr\": 0.013822047922283516\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4482174865564629,\n\
\ \"acc_stderr\": 0.0049629497842360445,\n \"acc_norm\": 0.5924118701453893,\n\
\ \"acc_norm_stderr\": 0.004903815885983271\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n\
\ \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n\
\ \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.24342105263157895,\n \"acc_stderr\": 0.034923496688842384,\n\
\ \"acc_norm\": 0.24342105263157895,\n \"acc_norm_stderr\": 0.034923496688842384\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2943396226415094,\n \"acc_stderr\": 0.028049186315695245,\n\
\ \"acc_norm\": 0.2943396226415094,\n \"acc_norm_stderr\": 0.028049186315695245\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610337,\n\
\ \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610337\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309994,\n\
\ \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309994\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2830687830687831,\n \"acc_stderr\": 0.023201392938194978,\n \"\
acc_norm\": 0.2830687830687831,\n \"acc_norm_stderr\": 0.023201392938194978\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.03852273364924316,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.03852273364924316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24193548387096775,\n\
\ \"acc_stderr\": 0.024362599693031083,\n \"acc_norm\": 0.24193548387096775,\n\
\ \"acc_norm_stderr\": 0.024362599693031083\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.23645320197044334,\n \"acc_stderr\": 0.029896114291733545,\n\
\ \"acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.029896114291733545\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3515151515151515,\n \"acc_stderr\": 0.0372820699868265,\n\
\ \"acc_norm\": 0.3515151515151515,\n \"acc_norm_stderr\": 0.0372820699868265\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2474747474747475,\n \"acc_stderr\": 0.03074630074212451,\n \"\
acc_norm\": 0.2474747474747475,\n \"acc_norm_stderr\": 0.03074630074212451\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.27461139896373055,\n \"acc_stderr\": 0.03221024508041153,\n\
\ \"acc_norm\": 0.27461139896373055,\n \"acc_norm_stderr\": 0.03221024508041153\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.022421273612923714,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.022421273612923714\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23703703703703705,\n \"acc_stderr\": 0.025928876132766107,\n \
\ \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.025928876132766107\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.28991596638655465,\n \"acc_stderr\": 0.029472485833136094,\n\
\ \"acc_norm\": 0.28991596638655465,\n \"acc_norm_stderr\": 0.029472485833136094\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.24503311258278146,\n \"acc_stderr\": 0.03511807571804724,\n \"\
acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.03511807571804724\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24770642201834864,\n \"acc_stderr\": 0.018508143602547822,\n \"\
acc_norm\": 0.24770642201834864,\n \"acc_norm_stderr\": 0.018508143602547822\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2824074074074074,\n \"acc_stderr\": 0.030701372111510934,\n \"\
acc_norm\": 0.2824074074074074,\n \"acc_norm_stderr\": 0.030701372111510934\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.29411764705882354,\n \"acc_stderr\": 0.031980016601150726,\n \"\
acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.031980016601150726\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3881856540084388,\n \"acc_stderr\": 0.031722950043323296,\n \
\ \"acc_norm\": 0.3881856540084388,\n \"acc_norm_stderr\": 0.031722950043323296\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.40358744394618834,\n\
\ \"acc_stderr\": 0.032928028193303135,\n \"acc_norm\": 0.40358744394618834,\n\
\ \"acc_norm_stderr\": 0.032928028193303135\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3282442748091603,\n \"acc_stderr\": 0.04118438565806299,\n\
\ \"acc_norm\": 0.3282442748091603,\n \"acc_norm_stderr\": 0.04118438565806299\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.36363636363636365,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.36363636363636365,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.37037037037037035,\n\
\ \"acc_stderr\": 0.04668408033024931,\n \"acc_norm\": 0.37037037037037035,\n\
\ \"acc_norm_stderr\": 0.04668408033024931\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.035590395316173425,\n\
\ \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.035590395316173425\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.27184466019417475,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.27184466019417475,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3547008547008547,\n\
\ \"acc_stderr\": 0.03134250486245402,\n \"acc_norm\": 0.3547008547008547,\n\
\ \"acc_norm_stderr\": 0.03134250486245402\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.3269476372924649,\n\
\ \"acc_stderr\": 0.01677490818013146,\n \"acc_norm\": 0.3269476372924649,\n\
\ \"acc_norm_stderr\": 0.01677490818013146\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.30057803468208094,\n \"acc_stderr\": 0.024685316867257792,\n\
\ \"acc_norm\": 0.30057803468208094,\n \"acc_norm_stderr\": 0.024685316867257792\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n\
\ \"acc_stderr\": 0.014400296429225622,\n \"acc_norm\": 0.24581005586592178,\n\
\ \"acc_norm_stderr\": 0.014400296429225622\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2679738562091503,\n \"acc_stderr\": 0.025360603796242567,\n\
\ \"acc_norm\": 0.2679738562091503,\n \"acc_norm_stderr\": 0.025360603796242567\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.2990353697749196,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2993827160493827,\n \"acc_stderr\": 0.025483115601195466,\n\
\ \"acc_norm\": 0.2993827160493827,\n \"acc_norm_stderr\": 0.025483115601195466\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.26595744680851063,\n \"acc_stderr\": 0.026358065698880596,\n \
\ \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.026358065698880596\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24511082138200782,\n\
\ \"acc_stderr\": 0.010986307870045503,\n \"acc_norm\": 0.24511082138200782,\n\
\ \"acc_norm_stderr\": 0.010986307870045503\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.21323529411764705,\n \"acc_stderr\": 0.02488097151229428,\n\
\ \"acc_norm\": 0.21323529411764705,\n \"acc_norm_stderr\": 0.02488097151229428\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2565359477124183,\n \"acc_stderr\": 0.017667841612378984,\n \
\ \"acc_norm\": 0.2565359477124183,\n \"acc_norm_stderr\": 0.017667841612378984\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.22857142857142856,\n \"acc_stderr\": 0.026882144922307748,\n\
\ \"acc_norm\": 0.22857142857142856,\n \"acc_norm_stderr\": 0.026882144922307748\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2736318407960199,\n\
\ \"acc_stderr\": 0.03152439186555401,\n \"acc_norm\": 0.2736318407960199,\n\
\ \"acc_norm_stderr\": 0.03152439186555401\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n\
\ \"acc_stderr\": 0.0357160923005348,\n \"acc_norm\": 0.30120481927710846,\n\
\ \"acc_norm_stderr\": 0.0357160923005348\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931586,\n \"mc2\": 0.36855840909843307,\n\
\ \"mc2_stderr\": 0.013989365630749612\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6148382004735596,\n \"acc_stderr\": 0.013676821287521419\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/jan-hq/LlamaCorn-1.1B-Chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|arc:challenge|25_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|gsm8k|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hellaswag|10_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-12T10-29-25.854017.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-12T10-29-25.854017.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- '**/details_harness|winogrande|5_2024-03-12T10-29-25.854017.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-12T10-29-25.854017.parquet'
- config_name: results
data_files:
- split: 2024_03_12T10_29_25.854017
path:
- results_2024-03-12T10-29-25.854017.parquet
- split: latest
path:
- results_2024-03-12T10-29-25.854017.parquet
---
# Dataset Card for Evaluation run of jan-hq/LlamaCorn-1.1B-Chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jan-hq/LlamaCorn-1.1B-Chat](https://huggingface.co/jan-hq/LlamaCorn-1.1B-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jan-hq__LlamaCorn-1.1B-Chat",
	"harness_winogrande_5",
	split="latest")
```
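The per-task config names visible in the YAML above follow a regular convention: `harness_` plus the harness task name (with dashes and colons replaced by underscores) plus the few-shot count. A small helper sketching that convention (this function is illustrative only, not part of the `datasets` API):

```python
def config_name(task: str, num_fewshot: int) -> str:
    """Build the config name used by this dataset for a given harness task.

    A task like "hendrycksTest-abstract_algebra" evaluated 5-shot maps to
    the config "harness_hendrycksTest_abstract_algebra_5"; dashes and
    colons in the task name become underscores.
    """
    safe = task.replace("-", "_").replace(":", "_")
    return f"harness_{safe}_{num_fewshot}"


# Examples matching configs listed in this card:
# config_name("hendrycksTest-abstract_algebra", 5)
#   -> "harness_hendrycksTest_abstract_algebra_5"
# config_name("truthfulqa:mc", 0) -> "harness_truthfulqa_mc_0"
# config_name("winogrande", 5)    -> "harness_winogrande_5"
```

Any of these names can be passed as the second argument to `load_dataset` in the snippet above.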
## Latest results
These are the [latest results from run 2024-03-12T10:29:25.854017](https://huggingface.co/datasets/open-llm-leaderboard/details_jan-hq__LlamaCorn-1.1B-Chat/blob/main/results_2024-03-12T10-29-25.854017.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.29373569196097277,
"acc_stderr": 0.032243803435004895,
"acc_norm": 0.2960484401036193,
"acc_norm_stderr": 0.03310756115855655,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931586,
"mc2": 0.36855840909843307,
"mc2_stderr": 0.013989365630749612
},
"harness|arc:challenge|25": {
"acc": 0.318259385665529,
"acc_stderr": 0.013611993916971451,
"acc_norm": 0.3378839590443686,
"acc_norm_stderr": 0.013822047922283516
},
"harness|hellaswag|10": {
"acc": 0.4482174865564629,
"acc_stderr": 0.0049629497842360445,
"acc_norm": 0.5924118701453893,
"acc_norm_stderr": 0.004903815885983271
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.24342105263157895,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.24342105263157895,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2943396226415094,
"acc_stderr": 0.028049186315695245,
"acc_norm": 0.2943396226415094,
"acc_norm_stderr": 0.028049186315695245
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.042801058373643966,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.042801058373643966
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.32340425531914896,
"acc_stderr": 0.030579442773610337,
"acc_norm": 0.32340425531914896,
"acc_norm_stderr": 0.030579442773610337
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.296551724137931,
"acc_stderr": 0.03806142687309994,
"acc_norm": 0.296551724137931,
"acc_norm_stderr": 0.03806142687309994
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2830687830687831,
"acc_stderr": 0.023201392938194978,
"acc_norm": 0.2830687830687831,
"acc_norm_stderr": 0.023201392938194978
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924316,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24193548387096775,
"acc_stderr": 0.024362599693031083,
"acc_norm": 0.24193548387096775,
"acc_norm_stderr": 0.024362599693031083
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.029896114291733545,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.029896114291733545
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3515151515151515,
"acc_stderr": 0.0372820699868265,
"acc_norm": 0.3515151515151515,
"acc_norm_stderr": 0.0372820699868265
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2474747474747475,
"acc_stderr": 0.03074630074212451,
"acc_norm": 0.2474747474747475,
"acc_norm_stderr": 0.03074630074212451
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27461139896373055,
"acc_stderr": 0.03221024508041153,
"acc_norm": 0.27461139896373055,
"acc_norm_stderr": 0.03221024508041153
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.022421273612923714,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.022421273612923714
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.025928876132766107,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.025928876132766107
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.28991596638655465,
"acc_stderr": 0.029472485833136094,
"acc_norm": 0.28991596638655465,
"acc_norm_stderr": 0.029472485833136094
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.03511807571804724,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.03511807571804724
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24770642201834864,
"acc_stderr": 0.018508143602547822,
"acc_norm": 0.24770642201834864,
"acc_norm_stderr": 0.018508143602547822
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2824074074074074,
"acc_stderr": 0.030701372111510934,
"acc_norm": 0.2824074074074074,
"acc_norm_stderr": 0.030701372111510934
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.031980016601150726,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.031980016601150726
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3881856540084388,
"acc_stderr": 0.031722950043323296,
"acc_norm": 0.3881856540084388,
"acc_norm_stderr": 0.031722950043323296
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.40358744394618834,
"acc_stderr": 0.032928028193303135,
"acc_norm": 0.40358744394618834,
"acc_norm_stderr": 0.032928028193303135
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3282442748091603,
"acc_stderr": 0.04118438565806299,
"acc_norm": 0.3282442748091603,
"acc_norm_stderr": 0.04118438565806299
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.36363636363636365,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.36363636363636365,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.04668408033024931,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.04668408033024931
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2883435582822086,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.2883435582822086,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.27184466019417475,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.27184466019417475,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3547008547008547,
"acc_stderr": 0.03134250486245402,
"acc_norm": 0.3547008547008547,
"acc_norm_stderr": 0.03134250486245402
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.3269476372924649,
"acc_stderr": 0.01677490818013146,
"acc_norm": 0.3269476372924649,
"acc_norm_stderr": 0.01677490818013146
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.30057803468208094,
"acc_stderr": 0.024685316867257792,
"acc_norm": 0.30057803468208094,
"acc_norm_stderr": 0.024685316867257792
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24581005586592178,
"acc_stderr": 0.014400296429225622,
"acc_norm": 0.24581005586592178,
"acc_norm_stderr": 0.014400296429225622
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2679738562091503,
"acc_stderr": 0.025360603796242567,
"acc_norm": 0.2679738562091503,
"acc_norm_stderr": 0.025360603796242567
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2993827160493827,
"acc_stderr": 0.025483115601195466,
"acc_norm": 0.2993827160493827,
"acc_norm_stderr": 0.025483115601195466
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.026358065698880596,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.026358065698880596
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24511082138200782,
"acc_stderr": 0.010986307870045503,
"acc_norm": 0.24511082138200782,
"acc_norm_stderr": 0.010986307870045503
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.21323529411764705,
"acc_stderr": 0.02488097151229428,
"acc_norm": 0.21323529411764705,
"acc_norm_stderr": 0.02488097151229428
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2565359477124183,
"acc_stderr": 0.017667841612378984,
"acc_norm": 0.2565359477124183,
"acc_norm_stderr": 0.017667841612378984
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22857142857142856,
"acc_stderr": 0.026882144922307748,
"acc_norm": 0.22857142857142856,
"acc_norm_stderr": 0.026882144922307748
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2736318407960199,
"acc_stderr": 0.03152439186555401,
"acc_norm": 0.2736318407960199,
"acc_norm_stderr": 0.03152439186555401
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-virology|5": {
"acc": 0.30120481927710846,
"acc_stderr": 0.0357160923005348,
"acc_norm": 0.30120481927710846,
"acc_norm_stderr": 0.0357160923005348
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931586,
"mc2": 0.36855840909843307,
"mc2_stderr": 0.013989365630749612
},
"harness|winogrande|5": {
"acc": 0.6148382004735596,
"acc_stderr": 0.013676821287521419
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
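The per-task `hendrycksTest` (MMLU) entries above are usually summarized as a single average accuracy across subtasks. A minimal sketch of that aggregation, using a hand-copied subset of the scores reported above (a real script would `json.load` the full results file, whose name is not specified here):

```python
# Average a few of the per-task accuracies reported in the results above.
# The dict below is a hand-copied subset for illustration only; in practice
# you would parse all "harness|hendrycksTest-*" entries from the JSON file.
scores = {
    "hendrycksTest-formal_logic": 0.24603174603174602,
    "hendrycksTest-global_facts": 0.24,
    "hendrycksTest-high_school_biology": 0.24193548387096775,
    "hendrycksTest-world_religions": 0.3216374269005848,
}

# Unweighted mean over the selected subtasks.
average_acc = sum(scores.values()) / len(scores)
print(f"Average acc over {len(scores)} tasks: {average_acc:.4f}")
# → Average acc over 4 tasks: 0.2624
```

Note this is an unweighted mean over subtasks; the leaderboard's headline MMLU number is computed the same way over all 57 subtasks.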
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]

# Junrulu/MT-Bench-Plus

---
license: mit
---
A further human-annotated version of [MT Bench](https://arxiv.org/abs/2306.05685), extended with more dialogue rounds and long-term questions.

Related paper: https://arxiv.org/abs/2308.08239