open-llm-leaderboard/details_mrm8488__mistral-7b-ft-h4-no_robots_instructions | ---
pretty_name: Evaluation run of mrm8488/mistral-7b-ft-h4-no_robots_instructions
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mrm8488/mistral-7b-ft-h4-no_robots_instructions](https://huggingface.co/mrm8488/mistral-7b-ft-h4-no_robots_instructions)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 1 configuration, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mrm8488__mistral-7b-ft-h4-no_robots_instructions\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-02T15:43:14.595425](https://huggingface.co/datasets/open-llm-leaderboard/details_mrm8488__mistral-7b-ft-h4-no_robots_instructions/blob/main/results_2023-12-02T15-43-14.595425.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the \"results\" and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.36694465504169826,\n\
\ \"acc_stderr\": 0.013275883047712211\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.36694465504169826,\n \"acc_stderr\": 0.013275883047712211\n\
\ }\n}\n```"
repo_url: https://huggingface.co/mrm8488/mistral-7b-ft-h4-no_robots_instructions
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_02T15_42_28.726427
path:
- '**/details_harness|gsm8k|5_2023-12-02T15-42-28.726427.parquet'
- split: 2023_12_02T15_42_53.272777
path:
- '**/details_harness|gsm8k|5_2023-12-02T15-42-53.272777.parquet'
- split: 2023_12_02T15_43_07.243379
path:
- '**/details_harness|gsm8k|5_2023-12-02T15-43-07.243379.parquet'
- split: 2023_12_02T15_43_14.595425
path:
- '**/details_harness|gsm8k|5_2023-12-02T15-43-14.595425.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-02T15-43-14.595425.parquet'
- config_name: results
data_files:
- split: 2023_12_02T15_42_28.726427
path:
- results_2023-12-02T15-42-28.726427.parquet
- split: 2023_12_02T15_42_53.272777
path:
- results_2023-12-02T15-42-53.272777.parquet
- split: 2023_12_02T15_43_07.243379
path:
- results_2023-12-02T15-43-07.243379.parquet
- split: 2023_12_02T15_43_14.595425
path:
- results_2023-12-02T15-43-14.595425.parquet
- split: latest
path:
- results_2023-12-02T15-43-14.595425.parquet
---
# Dataset Card for Evaluation run of mrm8488/mistral-7b-ft-h4-no_robots_instructions
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/mrm8488/mistral-7b-ft-h4-no_robots_instructions
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [mrm8488/mistral-7b-ft-h4-no_robots_instructions](https://huggingface.co/mrm8488/mistral-7b-ft-h4-no_robots_instructions) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 1 configuration, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mrm8488__mistral-7b-ft-h4-no_robots_instructions",
"harness_gsm8k_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-02T15:43:14.595425](https://huggingface.co/datasets/open-llm-leaderboard/details_mrm8488__mistral-7b-ft-h4-no_robots_instructions/blob/main/results_2023-12-02T15-43-14.595425.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" and "latest" splits for each eval):
```python
{
"all": {
"acc": 0.36694465504169826,
"acc_stderr": 0.013275883047712211
},
"harness|gsm8k|5": {
"acc": 0.36694465504169826,
"acc_stderr": 0.013275883047712211
}
}
```
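The aggregated metrics above are plain JSON; once parsed, they are just a nested dict keyed by task. A minimal sketch of consuming them (values copied from the run above):

```python
import json

# Aggregated metrics as emitted in the results JSON above.
raw = """
{
    "all": {
        "acc": 0.36694465504169826,
        "acc_stderr": 0.013275883047712211
    },
    "harness|gsm8k|5": {
        "acc": 0.36694465504169826,
        "acc_stderr": 0.013275883047712211
    }
}
"""
results = json.loads(raw)

# Report accuracy with its standard error for each entry.
for task, metrics in results.items():
    print(f"{task}: acc={metrics['acc']:.4f} +/- {metrics['acc_stderr']:.4f}")
```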
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
akshaypt7/dreambooth-hackathon-images | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 936008.0
num_examples: 30
download_size: 0
dataset_size: 936008.0
---
# Dataset Card for "dreambooth-hackathon-images"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dawood/dawood-theme |
---
tags: [gradio-theme]
---
# Dawood Theme
## Description
My Theme!
## Preview
Add an image preview of your theme here!
## Contributions
Thanks to [@dawood](https://huggingface.co/dawood) for adding this gradio theme!
|
migtissera/Hitchhiker | ---
license: apache-2.0
---
# Hitchhiker's Guide to the Galaxy
GPT-4-Turbo generations to elicit responses modelled on the Hitchhiker's Guide to the Galaxy.
Add some spice to your LLMs. Enjoy!

|
wadzaw/test | ---
license: mit
---
|
fancyzhx/c4_xz | ---
license: odc-by
---
A reproduction of AllenAI's C4 dataset, recompressed with xz. The files are about half the size of the original gzipped version.
For information about the original dataset, refer to https://huggingface.co/datasets/allenai/c4 |
dylanmontoya22/biobert-ner-medical-text | ---
dataset_info:
features:
- name: text
dtype: string
- name: tokens
sequence: string
- name: annotation
list:
- name: end
dtype: int64
- name: label
dtype: string
- name: start
dtype: int64
splits:
- name: train
num_bytes: 117531.98
num_examples: 710
download_size: 24684
dataset_size: 117531.98
---
# Dataset Card for "biobert-ner-medical-text"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tianyang/repobench-p | ---
language_creators:
- found
language:
- code
license:
- cc-by-nc-nd-4.0
multilinguality:
- multilingual
pretty_name: RepoBench-Pipeline
source_datasets:
- original
task_categories:
- text-retrieval
- text-generation
task_ids:
- document-retrieval
tags:
- code
---
# Dataset Card for RepoBench-P
## Dataset Description
- **Homepage:** https://github.com/Leolty/repobench
- **Paper:** https://arxiv.org/abs/2306.03091
## Dataset Summary
**RepoBench-P (Pipeline)** is a subtask of **RepoBench** ([GitHub](https://github.com/Leolty/repobench), [arXiv](https://arxiv.org/abs/2306.03091)) that combines the retrieval and code completion tasks. Specifically, the retrieval task is used to retrieve the most relevant code snippet first, and the code completion task is then performed with the retrieved snippet as cross-file context for next-line prediction. This mirrors the complex real-world scenarios a practical auto-completion system faces.
## Settings
- `cff`: short for cross_file_first, indicating the cross-file module in next line is first used in the current file.
- `cfr`: short for cross_file_random, indicating the cross-file module in next line is not first used in the current file.
- `if`: short for in_file, indicating the next line does not contain any cross-file module.
## Supported Languages
- `python` and `java`
## Loading Data
For example, to load the `python` dataset (you can provide the `split` argument to choose a specific setting):
```python
from datasets import load_dataset
dataset = load_dataset("tianyang/repobench-p", "python", split="cff")
```
> Note: The `split` argument is optional. If not provided, the entire dataset will be loaded.
## Dataset Structure
```json
{
"repo_name": "repository name of the data point",
"file_path": "path/to/current_file",
"context": [
{
"path": "path/to/cross_file_1",
"identifier": "identifier of the cross-file module",
"snippet": "the code snippet of the cross-file module",
"tokenized_snippet": "tokenized code snippet of the cross-file module"
},
// ...
{
"path": "path/to/cross_file_k",
"identifier": "identifier of the cross-file module",
"snippet": "the code snippet of the cross-file module",
"tokenized_snippet": "tokenized code snippet of the cross-file module"
},
],
"import_statement": "all import statements in current file",
"code": "the code for next-line prediction",
"next_line": "the next line of the code",
"gold_snippet_index": 2 // NOTE: Only for "cross_file_first" and "cross_file_random" settings, for "in_file" setting, we set it to -1.
}
```
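To make the schema concrete, here is a toy data point following the fields above (the contents are invented for illustration; real examples come from `load_dataset` as shown earlier). For the cross-file settings, `gold_snippet_index` marks the context entry a retriever should rank first before completion:

```python
# Toy data point mirroring the schema documented above
# (field values are made up for illustration).
example = {
    "repo_name": "demo/awesome-repo",
    "file_path": "src/app.py",
    "context": [
        {"path": "src/db.py", "identifier": "connect",
         "snippet": "def connect(url): ...",
         "tokenized_snippet": ["def", "connect", "(", "url", ")", ":"]},
        {"path": "src/utils.py", "identifier": "slugify",
         "snippet": "def slugify(s): ...",
         "tokenized_snippet": ["def", "slugify", "(", "s", ")", ":"]},
    ],
    "import_statement": "from src.db import connect",
    "code": "def main():\n",
    "next_line": "    conn = connect(DB_URL)",
    "gold_snippet_index": 0,  # -1 in the "in_file" setting
}

# Pick the gold cross-file snippet (if any) and prepend it to the
# in-file code to build a completion prompt.
idx = example["gold_snippet_index"]
gold = example["context"][idx] if idx >= 0 else None
prompt = (gold["snippet"] + "\n" if gold else "") + example["code"]
```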
## Licensing Information
CC BY-NC-ND 4.0
## Citation Information
```bibtex
@misc{liu2023repobench,
title={RepoBench: Benchmarking Repository-Level Code Auto-Completion Systems},
author={Tianyang Liu and Canwen Xu and Julian McAuley},
year={2023},
eprint={2306.03091},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
## Contributions
Thanks to [@Leolty](https://github.com/Leolty) for adding this dataset. |
keremberke/football-object-detection | ---
task_categories:
- object-detection
tags:
- roboflow
---
### Roboflow Dataset Page
[https://universe.roboflow.com/augmented-startups/football-player-detection-kucab](https://universe.roboflow.com/augmented-startups/football-player-detection-kucab?ref=roboflow2huggingface)
### Citation
```
@misc{ football-player-detection-kucab_dataset,
title = { Football-Player-Detection Dataset },
type = { Open Source Dataset },
author = { Augmented Startups },
howpublished = { \url{ https://universe.roboflow.com/augmented-startups/football-player-detection-kucab } },
url = { https://universe.roboflow.com/augmented-startups/football-player-detection-kucab },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { nov },
note = { visited on 2022-12-29 },
}
```
### License
CC BY 4.0
### Dataset Summary
This dataset was exported via roboflow.com on November 21, 2022 at 6:50 PM GMT
Roboflow is an end-to-end computer vision platform that helps you
* collaborate with your team on computer vision projects
* collect & organize images
* understand unstructured image data
* annotate, and create datasets
* export, train, and deploy computer vision models
* use active learning to improve your dataset over time
It includes 1232 images.
Track-players-and-football are annotated in COCO format.
The following pre-processing was applied to each image:
* Auto-orientation of pixel data (with EXIF-orientation stripping)
No image augmentation techniques were applied.
|
Owishiboo/grammar-correction | ---
language:
- en
---
Used in Correctness Chorus to train a T5 model for grammar correction.
volvoDon/mr-golem | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 155724.0
num_examples: 19
- name: test
num_bytes: 24588.0
num_examples: 3
download_size: 103142
dataset_size: 180312.0
---
# Dataset Card for "mr-golem"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-bc0462a6-7584893 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xtreme
eval_info:
task: entity_extraction
model: olpa/xml-roberta-base-finetuned-panx-fr
metrics: []
dataset_name: xtreme
dataset_config: PAN-X.fr
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: olpa/xml-roberta-base-finetuned-panx-fr
* Dataset: xtreme
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
VKBrutus/Leonardo_Muller | ---
license: openrail
---
|
henryscheible/winobias | ---
dataset_info:
features:
- name: label
dtype: int64
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: eval
num_bytes: 230400
num_examples: 1584
- name: train
num_bytes: 226080
num_examples: 1584
download_size: 83948
dataset_size: 456480
---
# Dataset Card for "winobias"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
XxHimaruxX/Voice | ---
license: afl-3.0
---
|
CyberHarem/ak74m_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ak74m/AK74M/AK74M (Girls' Frontline)
This is the dataset of ak74m/AK74M/AK74M (Girls' Frontline), containing 87 images and their tags.
The core tags of this character are `long_hair, bangs, breasts, blue_eyes, blonde_hair, hair_ornament, hat, beret, medium_breasts, red_headwear, snowflake_hair_ornament, purple_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 87 | 116.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ak74m_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 87 | 62.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ak74m_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 215 | 133.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ak74m_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 87 | 102.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ak74m_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 215 | 192.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ak74m_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ak74m_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, black_pantyhose, blush, solo, long_sleeves, looking_at_viewer, black_jacket, feet_out_of_frame, red_skirt, simple_background, white_background, closed_mouth, standing, smile, open_mouth, russian_text |
| 1 | 5 |  |  |  |  |  | 1girl, assault_rifle, black_footwear, black_pantyhose, full_body, kalashnikov_rifle, lace-up_boots, long_sleeves, red_skirt, solo, standing, black_jacket, cape, closed_mouth, holding_gun, looking_at_viewer, black_gloves, fingerless_gloves, knee_pads, pleated_skirt, russian_text, simple_background, white_background, blush, holster, knee_boots, knife, trigger_discipline |
| 2 | 5 |  |  |  |  |  | 1girl, simple_background, solo, upper_body, black_gloves, fingerless_gloves, long_sleeves, looking_at_viewer, blush, russian_text, smile, tactical_clothes, white_background, black_jacket, closed_mouth, open_mouth |
| 3 | 5 |  |  |  |  |  | blush, enmaided, looking_at_viewer, maid_headdress, 1girl, frills, hairclip, juliet_sleeves, maid_apron, open_mouth, solo, :o, black_dress, black_thighhighs, corset, detached_collar, double_v, feet_out_of_frame, garter_straps, neck_ribbon, ponytail, red_ribbon, simple_background, white_apron |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_pantyhose | blush | solo | long_sleeves | looking_at_viewer | black_jacket | feet_out_of_frame | red_skirt | simple_background | white_background | closed_mouth | standing | smile | open_mouth | russian_text | assault_rifle | black_footwear | full_body | kalashnikov_rifle | lace-up_boots | cape | holding_gun | black_gloves | fingerless_gloves | knee_pads | pleated_skirt | holster | knee_boots | knife | trigger_discipline | upper_body | tactical_clothes | enmaided | maid_headdress | frills | hairclip | juliet_sleeves | maid_apron | :o | black_dress | black_thighhighs | corset | detached_collar | double_v | garter_straps | neck_ribbon | ponytail | red_ribbon | white_apron |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------------|:--------|:-------|:---------------|:--------------------|:---------------|:--------------------|:------------|:--------------------|:-------------------|:---------------|:-----------|:--------|:-------------|:---------------|:----------------|:-----------------|:------------|:--------------------|:----------------|:-------|:--------------|:---------------|:--------------------|:------------|:----------------|:----------|:-------------|:--------|:---------------------|:-------------|:-------------------|:-----------|:-----------------|:---------|:-----------|:-----------------|:-------------|:-----|:--------------|:-------------------|:---------|:------------------|:-----------|:----------------|:--------------|:-----------|:-------------|:--------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | X | X | X | X | X | | | X | X | X | | X | X | X | | | | | | | | X | X | | | | | | | X | X | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | X | X | | X | | X | | X | | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
jdnvn/legal-llama2-4.4k-instruct | ---
license: apache-2.0
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: prompt
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 16562924
num_examples: 4394
download_size: 5252329
dataset_size: 16562924
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Yubing/Ubin | ---
license: openrail
---
|
SGBTalha/negaoRVCv2 | ---
license: openrail
---
|
Solshine/Olympia_WA_USA_Weather_2020s | ---
license: mit
---
|
victor/hf-spaces-with-descriptions | ---
language:
- en
---
# HF Spaces with Descriptions
A collection of Hugging Face Spaces with AI generated descriptions (using Mixtral). |
Alex-Song/Test | ---
license: apache-2.0
task_categories:
- translation
language:
- ja
- zh
- ar
tags:
- music
pretty_name: MTSpeech
size_categories:
- 1K<n<10K
extra_gated_prompt: "You agree to not attempt to determine the identity of individuals in this dataset"
extra_gated_fields:
Name: text
Email: text
Organization: text
Address: text
I agree to not attempt to determine the identity of speakers in this dataset: checkbox
I accept the terms of access: checkbox
viewer: false
---
|
TechieTeee/Chainlink_USDT_Data | ---
license: mit
---
|
zhangfei2023/cccc | ---
license: openrail
---
|
johannes-garstenauer/balanced_structs_reduced_labelled_large_new_key_addr | ---
dataset_info:
features:
- name: struct
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 78719500.0
num_examples: 279780
download_size: 21110038
dataset_size: 78719500.0
---
# Dataset Card for "balanced_structs_reduced_labelled_large_new_key_addr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ranWang/preview_alignment | ---
dataset_info:
features:
- name: zh
dtype: string
- name: en
dtype: string
splits:
- name: train
num_bytes: 10196107
num_examples: 17880
download_size: 5226449
dataset_size: 10196107
---
# Dataset Card for "preview_alignment"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
buptwq/finetune-lora-sd | ---
license: cc
task_categories:
- text-to-image
language:
- en
---
# Why can't the data be loaded online?
I can load the data from my local path as follows:
```
from datasets import load_dataset
dataset = load_dataset("imagefolder", data_dir="/path/to/folder")
```
However, why does loading it online not work? |
Nerfgun3/ouroboros_embeddings | ---
language:
- en
license: creativeml-openrail-m
thumbnail: "https://huggingface.co/datasets/Nerfgun3/ouroboros_embeddings/resolve/main/ouroboros_showcase.jpg"
tags:
- stable-diffusion
- text-to-image
- image-to-image
inference: false
---
# Ouroboros Style Embeddings / Textual Inversion
<img alt="Showcase" src="https://huggingface.co/datasets/Nerfgun3/ouroboros_embeddings/resolve/main/ouroboros_showcase.jpg"/>
## Intro
Both embeddings are quite similar in style, but were trained on different datasets.
## Usage
To use my embeddings, download the file and drop it into the "\stable-diffusion-webui\embeddings" folder.
Personally, I would recommend using my embeddings with a strength of 0.8, like ```"drawn by (filename:0.8)"```
I trained both embeddings for two epochs, up to 8000 steps.
I hope you enjoy the embedding. If you have any questions, you can ask me anything via Discord: "Nerfgun3#7508"
### Dark ouroboros
This embedding was trained on a dataset with dark backgrounds.
To use it in a prompt: ```"drawn by dark_ouroboros"```
### White ouroboros
This embedding was trained on a dataset with white backgrounds.
To use it in a prompt: ```"drawn by white_ouroboros"```
## License
This embedding is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the embedding to deliberately produce nor share illegal or harmful outputs or content
2. The authors claim no rights on the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license
3. You may re-distribute the weights and use the embedding commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M to all your users (please read the license entirely and carefully)
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license) |
jondurbin/airoboros-3.2 | ---
license: cc-by-4.0
tags:
- not-for-all-audiences
---
## Overview
This dataset is a continuation of the [airoboros-3.1](https://hf.co/datasets/jondurbin/airoboros-3.1) with the following changes:
* MathJSON has been removed for the time-being, because it seems to confuse the models at times, causing more problems than it's worth. The mathjson dataset can be found [here](https://huggingface.co/datasets/jondurbin/mathjson-alpha)
* The de-censorship data has been re-added, to ensure a non-DPO SFT model using this dataset is relatively uncensored.
* ~11k instructions from [slimorca](https://huggingface.co/datasets/Open-Orca/SlimOrca) were extended with an additional follow-up turn to enhance multi-turn capabilities.
## Format
The format is now ShareGPT, to better accommodate the OS ecosystem's fine-tuning tooling.
## Usage restriction
To use this data, you must acknowledge/agree to the following:
- a small sampling of the data contained within is "toxic"/"harmful", and contains profanity and other types of sensitive content
- none of the content or views contained in the dataset necessarily align with my personal beliefs or opinions, they are simply text generated by LLMs without a great amount of validation
- you are able to use the dataset lawfully, particularly in locations with less-than-free speech laws
- you, and you alone are responsible for having downloaded and used the dataset, and I am completely indemnified from any and all liabilities
Also note that the data was generated primarily with gpt-4, and therefore may have some strings attached to the OpenAI terms of service. |
blinoff/kinopoisk | ---
language:
- ru
multilinguality:
- monolingual
pretty_name: Kinopoisk
size_categories:
- 10K<n<100K
task_categories:
- text-classification
task_ids:
- sentiment-classification
---
### Dataset Summary
Kinopoisk movie reviews dataset (TOP250 & BOTTOM100 rank lists).
In total it contains 36,591 reviews from July 2004 to November 2012.
It has the following distribution along the 3-point sentiment scale:
- Good: 27,264;
- Bad: 4,751;
- Neutral: 4,576.
### Data Fields
Each sample contains the following fields:
- **part**: rank list top250 or bottom100;
- **movie_name**;
- **review_id**;
- **author**: review author;
- **date**: date of a review;
- **title**: review title;
- **grade3**: sentiment score Good, Bad or Neutral;
- **grade10**: sentiment score on a 10-point scale parsed from text;
- **content**: review text.
### Python
```python3
import pandas as pd
df = pd.read_json('kinopoisk.jsonl', lines=True)
df.sample(5)
```
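The 3-point distribution quoted in the summary can be recomputed from the `grade3` field. A small sketch with toy rows (for the real data, substitute the `read_json` call above):

```python
import pandas as pd

# Toy rows carrying the documented fields; the real frame comes from
# pd.read_json('kinopoisk.jsonl', lines=True).
df = pd.DataFrame({
    "part": ["top250", "top250", "bottom100", "top250", "bottom100"],
    "grade3": ["Good", "Good", "Bad", "Neutral", "Bad"],
})

# Distribution along the 3-point sentiment scale.
dist = df["grade3"].value_counts().to_dict()
print(dist)
```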
### Citation
```
@article{blinov2013research,
title={Research of lexical approach and machine learning methods for sentiment analysis},
author={Blinov, PD and Klekovkina, Maria and Kotelnikov, Eugeny and Pestov, Oleg},
journal={Computational Linguistics and Intellectual Technologies},
volume={2},
number={12},
pages={48--58},
year={2013}
}
```
|
tr416/dataset_20231006_201304 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 762696.0
num_examples: 297
- name: test
num_bytes: 7704.0
num_examples: 3
download_size: 73952
dataset_size: 770400.0
---
# Dataset Card for "dataset_20231006_201304"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Chunt0/patrick_nation | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 8533138.0
num_examples: 40
download_size: 8531948
dataset_size: 8533138.0
---
# Dataset Card for "patrick_nation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kushinm/WID_sym_human_vs_ai | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data.csv
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-27000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 667324
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
smangrul/peft_docs | ---
license: apache-2.0
---
|
autoevaluate/autoeval-eval-cnn_dailymail-3.0.0-6c534f-38130145045 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: google/pegasus-cnn_dailymail
metrics: ['rouge', 'accuracy', 'bleu']
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: test
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: google/pegasus-cnn_dailymail
* Dataset: cnn_dailymail
* Config: 3.0.0
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Sini](https://huggingface.co/Sini) for evaluating this model. |
mask-distilled-one-sec-cv12/chunk_65 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1200601944
num_examples: 235782
download_size: 1220462426
dataset_size: 1200601944
---
# Dataset Card for "chunk_65"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_cola_future_sub_gon | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 3630
num_examples: 40
- name: test
num_bytes: 3610
num_examples: 40
- name: train
num_bytes: 27267
num_examples: 339
download_size: 20545
dataset_size: 34507
---
# Dataset Card for "MULTI_VALUE_cola_future_sub_gon"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kbmurali/gpt2-qa-v2-train-ds | ---
license: apache-2.0
dataset_info:
features:
- name: qa_instruction
dtype: string
splits:
- name: train
num_bytes: 7992652
num_examples: 9000
download_size: 4841482
dataset_size: 7992652
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AIrtisian/useless-data | ---
license: unknown
---
|
liuyanchen1015/MULTI_VALUE_sst2_existential_got | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 2453
num_examples: 17
- name: test
num_bytes: 3837
num_examples: 27
- name: train
num_bytes: 36123
num_examples: 293
download_size: 24612
dataset_size: 42413
---
# Dataset Card for "MULTI_VALUE_sst2_existential_got"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_100000_Diabetes130US_sgosdt_l256_dim7_d3_sd0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: input_y_clean
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 2057200000
num_examples: 100000
- name: validation
num_bytes: 205720000
num_examples: 10000
download_size: 257403365
dataset_size: 2262920000
---
# Dataset Card for "autotree_automl_100000_Diabetes130US_sgosdt_l256_dim7_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
japanese-asr/whisper_transcriptions.reazonspeech.all_14 | ---
dataset_info:
config_name: all
features:
- name: name
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: train
num_bytes: 30573560048.0
num_examples: 268539
download_size: 30336563240
dataset_size: 30573560048.0
configs:
- config_name: all
data_files:
- split: train
path: all/train-*
---
|
open-llm-leaderboard/details_SilverCoder66__Mistral-7B-Instruct-adapt-v0.23 | ---
pretty_name: Evaluation run of SilverCoder66/Mistral-7B-Instruct-adapt-v0.23
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SilverCoder66/Mistral-7B-Instruct-adapt-v0.23](https://huggingface.co/SilverCoder66/Mistral-7B-Instruct-adapt-v0.23)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SilverCoder66__Mistral-7B-Instruct-adapt-v0.23\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-26T13:16:04.743245](https://huggingface.co/datasets/open-llm-leaderboard/details_SilverCoder66__Mistral-7B-Instruct-adapt-v0.23/blob/main/results_2024-01-26T13-16-04.743245.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6558565508085182,\n\
\ \"acc_stderr\": 0.03205699333246102,\n \"acc_norm\": 0.6552801158659124,\n\
\ \"acc_norm_stderr\": 0.03272709560202178,\n \"mc1\": 0.5667074663402693,\n\
\ \"mc1_stderr\": 0.017347024450107478,\n \"mc2\": 0.7126457863777319,\n\
\ \"mc2_stderr\": 0.014796561609011638\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7022184300341296,\n \"acc_stderr\": 0.013363080107244484,\n\
\ \"acc_norm\": 0.7252559726962458,\n \"acc_norm_stderr\": 0.013044617212771227\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7127066321449911,\n\
\ \"acc_stderr\": 0.004515748192605716,\n \"acc_norm\": 0.8849830711013742,\n\
\ \"acc_norm_stderr\": 0.0031839033919416975\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700914,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700914\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287533,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287533\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4312169312169312,\n \"acc_stderr\": 0.025506481698138208,\n \"\
acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.025506481698138208\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.02328766512726854,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.02328766512726854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290902,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290902\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608306,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608306\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.02353292543104429,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.02353292543104429\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42793296089385474,\n\
\ \"acc_stderr\": 0.01654788799741611,\n \"acc_norm\": 0.42793296089385474,\n\
\ \"acc_norm_stderr\": 0.01654788799741611\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02609016250427905,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02609016250427905\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.025403832978179615,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.025403832978179615\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959607,\n\
\ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959607\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n\
\ \"acc_stderr\": 0.012741974333897229,\n \"acc_norm\": 0.4667535853976532,\n\
\ \"acc_norm_stderr\": 0.012741974333897229\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5667074663402693,\n\
\ \"mc1_stderr\": 0.017347024450107478,\n \"mc2\": 0.7126457863777319,\n\
\ \"mc2_stderr\": 0.014796561609011638\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8389897395422258,\n \"acc_stderr\": 0.010329712832785722\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7020470053070508,\n \
\ \"acc_stderr\": 0.01259793223291452\n }\n}\n```"
repo_url: https://huggingface.co/SilverCoder66/Mistral-7B-Instruct-adapt-v0.23
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|arc:challenge|25_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|gsm8k|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hellaswag|10_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T13-16-04.743245.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-26T13-16-04.743245.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- '**/details_harness|winogrande|5_2024-01-26T13-16-04.743245.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-26T13-16-04.743245.parquet'
- config_name: results
data_files:
- split: 2024_01_26T13_16_04.743245
path:
- results_2024-01-26T13-16-04.743245.parquet
- split: latest
path:
- results_2024-01-26T13-16-04.743245.parquet
---
# Dataset Card for Evaluation run of SilverCoder66/Mistral-7B-Instruct-adapt-v0.23
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SilverCoder66/Mistral-7B-Instruct-adapt-v0.23](https://huggingface.co/SilverCoder66/Mistral-7B-Instruct-adapt-v0.23) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
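The split naming convention above can be sketched in a few lines: a run's timestamp becomes its split name by replacing the characters that are not allowed in split names (`-` and `:`) with `_`. This is an illustrative sketch of the convention as it appears in this card, not leaderboard code.

```python
# Derive a split name from a run timestamp, as used in this dataset's configs.
# Sketch of the observed convention; not part of the leaderboard codebase.
def run_timestamp_to_split_name(timestamp: str) -> str:
    return timestamp.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split_name("2024-01-26T13:16:04.743245"))
# → 2024_01_26T13_16_04.743245
```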
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SilverCoder66__Mistral-7B-Instruct-adapt-v0.23",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-26T13:16:04.743245](https://huggingface.co/datasets/open-llm-leaderboard/details_SilverCoder66__Mistral-7B-Instruct-adapt-v0.23/blob/main/results_2024-01-26T13-16-04.743245.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6558565508085182,
"acc_stderr": 0.03205699333246102,
"acc_norm": 0.6552801158659124,
"acc_norm_stderr": 0.03272709560202178,
"mc1": 0.5667074663402693,
"mc1_stderr": 0.017347024450107478,
"mc2": 0.7126457863777319,
"mc2_stderr": 0.014796561609011638
},
"harness|arc:challenge|25": {
"acc": 0.7022184300341296,
"acc_stderr": 0.013363080107244484,
"acc_norm": 0.7252559726962458,
"acc_norm_stderr": 0.013044617212771227
},
"harness|hellaswag|10": {
"acc": 0.7127066321449911,
"acc_stderr": 0.004515748192605716,
"acc_norm": 0.8849830711013742,
"acc_norm_stderr": 0.0031839033919416975
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700914,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700914
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287533,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287533
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.025506481698138208,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.025506481698138208
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726854,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726854
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290902,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290902
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608306,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608306
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.02353292543104429,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.02353292543104429
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42793296089385474,
"acc_stderr": 0.01654788799741611,
"acc_norm": 0.42793296089385474,
"acc_norm_stderr": 0.01654788799741611
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.02609016250427905,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.02609016250427905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179615,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179615
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959607,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959607
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.012741974333897229,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.012741974333897229
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5667074663402693,
"mc1_stderr": 0.017347024450107478,
"mc2": 0.7126457863777319,
"mc2_stderr": 0.014796561609011638
},
"harness|winogrande|5": {
"acc": 0.8389897395422258,
"acc_stderr": 0.010329712832785722
},
"harness|gsm8k|5": {
"acc": 0.7020470053070508,
"acc_stderr": 0.01259793223291452
}
}
```
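The top-level `"all"` block appears to aggregate the per-task metrics. A minimal sketch of computing a mean accuracy from a results dict shaped like the JSON above (the exact aggregation used by the leaderboard is an assumption here, not confirmed by this card):

```python
# Sketch: average per-task "acc" values from a results dict shaped like the
# JSON above. Uses two values copied from this card; assumption, not
# leaderboard code.
results = {
    "harness|arc:challenge|25": {"acc": 0.7022184300341296},
    "harness|hellaswag|10": {"acc": 0.7127066321449911},
}

accs = [task["acc"] for task in results.values() if "acc" in task]
mean_acc = sum(accs) / len(accs)
print(mean_acc)
```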
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
qa_srl | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- multiple-choice-qa
- open-domain-qa
paperswithcode_id: qa-srl
pretty_name: QA-SRL
dataset_info:
features:
- name: sentence
dtype: string
- name: sent_id
dtype: string
- name: predicate_idx
dtype: int32
- name: predicate
dtype: string
- name: question
sequence: string
- name: answers
sequence: string
config_name: plain_text
splits:
- name: train
num_bytes: 1835549
num_examples: 6414
- name: validation
num_bytes: 632992
num_examples: 2183
- name: test
num_bytes: 637317
num_examples: 2201
download_size: 1087729
dataset_size: 3105858
---
# Dataset Card for QA-SRL
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Homepage](https://dada.cs.washington.edu/qasrl/#page-top)
- **Annotation Tool:** [Annotation tool](https://github.com/luheng/qasrl_annotation)
- **Repository:** [Repository](https://dada.cs.washington.edu/qasrl/#dataset)
- **Paper:** [QA-SRL paper](https://www.aclweb.org/anthology/D15-1076.pdf)
- **Point of Contact:** [Luheng He](luheng@cs.washington.edu)
### Dataset Summary
QA-SRL models the predicate-argument structure of a sentence as a set of question-answer pairs. This approach enables practical large-scale annotation of training data: it focuses on semantic rather than syntactic annotation, and introduces a scalable method for gathering data that supports both training and evaluation.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
This dataset is in English.
## Dataset Structure
### Data Instances
We use question-answer pairs to model verbal predicate-argument structure. The questions start with wh-words (Who, What, Where, When, etc.) and contain a verb predicate from the sentence; the answers are phrases in the sentence. For example:
`UCD finished the 2006 championship as Dublin champions , by beating St Vincents in the final .`
| Predicate | Question | Answer |
| --- | --- | --- |
| finished | Who finished something? | UCD |
| finished | What did someone finish? | the 2006 championship |
| finished | What did someone finish something as? | Dublin champions |
| finished | How did someone finish something? | by beating St Vincents in the final |
| beating | Who beat someone? | UCD |
| beating | When did someone beat someone? | in the final |
| beating | Who did someone beat? | St Vincents |
### Data Fields
Annotations provided are as follows:
- `sentence`: the tokenized sentence
- `sent_id`: the sentence identifier
- `predicate_idx`: the index of the predicate (its position in the sentence)
- `predicate`: the predicate token
- `question`: the question as a list of tokens. The question always consists of seven slots, as defined in the paper; empty slots are represented with the marker `_`, and the question ends with a question mark.
- `answers`: the list of answers to the question
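The seven-slot question format can be rendered as a readable question string by dropping the `_` placeholders. A minimal sketch (the slot layout shown here is illustrative, not taken from an actual dataset row):

```python
def render_question(slots):
    """Join the non-empty slots of a QA-SRL question into a readable string."""
    # Drop the "_" markers used for empty slots.
    tokens = [t for t in slots if t != "_"]
    # The last token is the question mark; attach it without a leading space.
    return " ".join(tokens[:-1]) + tokens[-1]

# A hypothetical seven-slot question for the predicate "finished":
print(render_question(["Who", "_", "_", "finished", "something", "_", "?"]))
# Who finished something?
```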
### Data Splits
| Dataset | Sentences | Verbs | QAs |
| --- | --- | --- | --- |
| **newswire-train** | 744 | 2020 | 4904 |
| **newswire-dev** | 249 | 664 | 1606 |
| **newswire-test** | 248 | 652 | 1599 |
| **Wikipedia-train** | 1174 | 2647 | 6414 |
| **Wikipedia-dev** | 392 | 895 | 2183 |
| **Wikipedia-test** | 393 | 898 | 2201 |
**Please note:** this dataset contains only the Wikipedia portion. Reconstructing the newswire portion requires the CoNLL-2009 English training data, which is distributed under license; the newswire data is therefore not included here.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
In total, over 3,000 sentences (nearly 8,000 verbs) were annotated across two domains: newswire (PropBank) and Wikipedia.
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
Non-expert annotators were given a short tutorial and a small set of sample annotations (about 10 sentences). Annotators were hired if they demonstrated a good understanding of English and the task. The entire screening process usually took less than two hours.
#### Who are the annotators?
10 part-time, non-expert annotators from Upwork (previously oDesk).
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[Luheng He](luheng@cs.washington.edu)
### Licensing Information
[More Information Needed]
### Citation Information
```
@inproceedings{he2015qasrl,
  title        = {QA-SRL: Question-Answer Driven Semantic Role Labeling},
  author       = {Luheng He and Mike Lewis and Luke Zettlemoyer},
  year         = {2015},
  publisher    = {cs.washington.edu},
  howpublished = {\url{https://dada.cs.washington.edu/qasrl/#page-top}},
}
```
### Contributions
Thanks to [@bpatidar](https://github.com/bpatidar) for adding this dataset. |
tyzhu/squad_qa_rare_v5_full_recite_ans_sent_random_permute_rerun_8 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 10114704.024471635
num_examples: 6305
- name: validation
num_bytes: 405531
num_examples: 300
download_size: 1645263
dataset_size: 10520235.024471635
---
# Dataset Card for "squad_qa_rare_v5_full_recite_ans_sent_random_permute_rerun_8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gardner/Magicoder-OSS-Instruct-75K-sharegpt | ---
dataset_info:
features:
- name: conversations
dtype: string
splits:
- name: train
num_bytes: 186950396
num_examples: 75197
download_size: 72570993
dataset_size: 186950396
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Sketched33/Cities_Geographic_Historic_Cultural_Data | ---
license: apache-2.0
dataset_info:
features:
- name: city_name
dtype: string
- name: latitude
dtype: float64
- name: longitude
dtype: float64
- name: data_type
dtype: string
- name: data
dtype: string
splits:
- name: train
num_bytes: 296650
num_examples: 450
download_size: 155196
dataset_size: 296650
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
EmnaBou/DataTranslationDT | ---
annotations_creators:
- found
language_creators:
- found
language:
- ar
license:
- unknown
multilinguality:
- multilingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- translation
task_ids: []
paperswithcode_id: null
pretty_name: DataTranslationDT
dataset_info:
- config_name: disluent_fluent
features:
- name: translation
dtype:
translation:
languages:
- disfluent
- fluent
- name: id
dtype: string
---
# Dataset Card for DataTranslationDT
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:** None
- **Paper:**
- **Leaderboard:** [More Information Needed]
- **Point of Contact:** [More Information Needed]
### Dataset Summary
`dataset = load_dataset("DataTranslationDT", lang1="disfluent", lang2="fluent")`
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
|
AdapterOcean/Open_Platypus_standardized_cluster_6_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 2155668
num_examples: 2013
download_size: 1081401
dataset_size: 2155668
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Open_Platypus_standardized_cluster_6_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
karmiq/wikipedia-embeddings-cs-e5-small | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: chunks
sequence: string
- name: embeddings
sequence:
sequence: float32
splits:
- name: train
num_bytes: 3302394852
num_examples: 534044
download_size: 3029933751
dataset_size: 3302394852
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- cs
size_categories:
- 100K<n<1M
task_categories:
- text-generation
- fill-mask
license:
- cc-by-sa-3.0
- gfdl
---
This dataset contains the Czech subset of the [`wikimedia/wikipedia`](https://huggingface.co/datasets/wikimedia/wikipedia) dataset. Each page is divided into paragraphs, stored as a list in the `chunks` column. For every paragraph, embeddings are created using the [`intfloat/multilingual-e5-small`](https://huggingface.co/intfloat/multilingual-e5-small) model.
## Usage
Load the dataset:
```python
from datasets import load_dataset
ds = load_dataset("karmiq/wikipedia-embeddings-cs-e5-small", split="train")
ds[1]
```
```
{
'id': '1',
'url': 'https://cs.wikipedia.org/wiki/Astronomie',
'title': 'Astronomie',
'chunks': [
'Astronomie, řecky αστρονομία z άστρον ( astron ) hvězda a νόμος ( nomos )...',
'Myšlenky Aristotelovy rozvinul ve 2. století našeho letopočtu Klaudios Ptolemaios...',
...,
],
'embeddings': [
[0.09006806463003159, -0.009814552962779999, ...],
[0.10767366737127304, ...],
...
]
}
```
The structure makes it easy to use the dataset for implementing semantic search.
<details>
<summary>Load the data in Elasticsearch</summary>
```python
from elasticsearch import Elasticsearch
from elasticsearch.helpers import parallel_bulk
from tqdm import tqdm

# Assumes a running Elasticsearch cluster; adjust the URL for your setup.
es = Elasticsearch("http://localhost:9200")

def doc_generator(data, batch_size=1000):
for batch in data.with_format("numpy").iter(batch_size):
for i, id in enumerate(batch["id"]):
output = {"id": id}
output["title"] = batch["title"][i]
output["url"] = batch["url"][i]
output["parts"] = [
{ "chunk": chunk, "embedding": embedding }
for chunk, embedding in zip(batch["chunks"][i], batch["embeddings"][i])
]
yield output
num_indexed, num_failed = 0, 0
progress = tqdm(total=ds.num_rows, unit="doc", desc="Indexing")
for ok, info in parallel_bulk(
es,
index="wikipedia-search",
actions=doc_generator(ds),
raise_on_error=False,
):
if not ok:
print(f"ERROR {info['index']['status']}: "
f"{info['index']['error']['type']}: {info['index']['error']['caused_by']['type']}: "
f"{info['index']['error']['caused_by']['reason'][:250]}")
progress.update(1)
```
</details>
<details>
<summary>Use <code>sentence_transformers.util.semantic_search</code></summary>
```python
import os
import textwrap

import sentence_transformers
model = sentence_transformers.SentenceTransformer("intfloat/multilingual-e5-small")
ds.set_format(type="torch", columns=["embeddings"], output_all_columns=True)
# Flatten the dataset
def explode_sequence(batch):
output = { "id": [], "url": [], "title": [], "chunk": [], "embedding": [] }
for id, url, title, chunks, embeddings in zip(
batch["id"], batch["url"], batch["title"], batch["chunks"], batch["embeddings"]
):
output["id"].extend([id for _ in range(len(chunks))])
output["url"].extend([url for _ in range(len(chunks))])
output["title"].extend([title for _ in range(len(chunks))])
output["chunk"].extend(chunks)
output["embedding"].extend(embeddings)
return output
ds_flat = ds.map(
explode_sequence,
batched=True,
remove_columns=ds.column_names,
num_proc=min(os.cpu_count(), 32),
desc="Flatten")
ds_flat
query = "Čím se zabývá fyzika?"
hits = sentence_transformers.util.semantic_search(
query_embeddings=model.encode(query),
corpus_embeddings=ds_flat["embedding"],
top_k=10)
for hit in hits[0]:
title = ds_flat[hit['corpus_id']]['title']
chunk = ds_flat[hit['corpus_id']]['chunk']
print(f"[{hit['score']:0.2f}] {textwrap.shorten(chunk, width=100, placeholder='…')} [{title}]")
# [0.90] Fyzika částic ( též částicová fyzika ) je oblast fyziky, která se zabývá částicemi. V širším smyslu… [Fyzika částic]
# [0.89] Fyzika ( z řeckého φυσικός ( fysikos ): přírodní, ze základu φύσις ( fysis ): příroda, archaicky… [Fyzika]
# ...
```
</details>
The embeddings generation took about 1 hour on an NVIDIA A100 80GB GPU.
## License
See license of the original dataset: <https://huggingface.co/datasets/wikimedia/wikipedia>.
|
fuyu-quant/ibl-regression-ver5-linear | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: index
dtype: int64
- name: category
dtype: string
splits:
- name: train
num_bytes: 283673637
num_examples: 100000
- name: test
num_bytes: 2834413
num_examples: 1000
download_size: 170235128
dataset_size: 286508050
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_uproai__RosMistral-2x7B | ---
pretty_name: Evaluation run of uproai/RosMistral-2x7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uproai/RosMistral-2x7B](https://huggingface.co/uproai/RosMistral-2x7B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uproai__RosMistral-2x7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-21T12:30:40.736831](https://huggingface.co/datasets/open-llm-leaderboard/details_uproai__RosMistral-2x7B/blob/main/results_2024-02-21T12-30-40.736831.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6551960182700177,\n\
\ \"acc_stderr\": 0.03189102529877818,\n \"acc_norm\": 0.6570328046732227,\n\
\ \"acc_norm_stderr\": 0.03252818516001897,\n \"mc1\": 0.3598531211750306,\n\
\ \"mc1_stderr\": 0.016801860466677147,\n \"mc2\": 0.5287191041315256,\n\
\ \"mc2_stderr\": 0.01534150118647353\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6339590443686007,\n \"acc_stderr\": 0.01407722310847014,\n\
\ \"acc_norm\": 0.6621160409556314,\n \"acc_norm_stderr\": 0.013822047922283509\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6714797849034057,\n\
\ \"acc_stderr\": 0.00468715199479107,\n \"acc_norm\": 0.8554072893845848,\n\
\ \"acc_norm_stderr\": 0.0035097096477918433\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.032081157507886836,\n\
\ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.032081157507886836\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"\
acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603348\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3851851851851852,\n \"acc_stderr\": 0.02967090612463088,\n \
\ \"acc_norm\": 0.3851851851851852,\n \"acc_norm_stderr\": 0.02967090612463088\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.029719142876342853,\n\
\ \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.029719142876342853\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374291,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374291\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.034063153607115086,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.034063153607115086\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156861,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156861\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917671,\n \"\
acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917671\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n\
\ \"acc_stderr\": 0.0133064782430663,\n \"acc_norm\": 0.8339719029374202,\n\
\ \"acc_norm_stderr\": 0.0133064782430663\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500107,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500107\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3843575418994413,\n\
\ \"acc_stderr\": 0.016269088663959402,\n \"acc_norm\": 0.3843575418994413,\n\
\ \"acc_norm_stderr\": 0.016269088663959402\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.02463004897982478,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.02463004897982478\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7654320987654321,\n \"acc_stderr\": 0.023576881744005723,\n\
\ \"acc_norm\": 0.7654320987654321,\n \"acc_norm_stderr\": 0.023576881744005723\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n\
\ \"acc_stderr\": 0.012740853872949837,\n \"acc_norm\": 0.4661016949152542,\n\
\ \"acc_norm_stderr\": 0.012740853872949837\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.027678468642144717,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.027678468642144717\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.018850084696468712,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.018850084696468712\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399677,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3598531211750306,\n\
\ \"mc1_stderr\": 0.016801860466677147,\n \"mc2\": 0.5287191041315256,\n\
\ \"mc2_stderr\": 0.01534150118647353\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7924230465666929,\n \"acc_stderr\": 0.011398593419386772\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.621683093252464,\n \
\ \"acc_stderr\": 0.013358407831777105\n }\n}\n```"
repo_url: https://huggingface.co/uproai/RosMistral-2x7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|arc:challenge|25_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|gsm8k|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hellaswag|10_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T12-30-40.736831.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-21T12-30-40.736831.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- '**/details_harness|winogrande|5_2024-02-21T12-30-40.736831.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-21T12-30-40.736831.parquet'
- config_name: results
data_files:
- split: 2024_02_21T12_30_40.736831
path:
- results_2024-02-21T12-30-40.736831.parquet
- split: latest
path:
- results_2024-02-21T12-30-40.736831.parquet
---
# Dataset Card for Evaluation run of uproai/RosMistral-2x7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [uproai/RosMistral-2x7B](https://huggingface.co/uproai/RosMistral-2x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
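The run-split naming can be illustrated with a small sketch. The first split name below is a hypothetical earlier run added purely for illustration; only the second one actually exists in this dataset:

```python
# Run splits are named with the run timestamp in the form
# YYYY_MM_DDTHH_MM_SS.ffffff.
run_splits = [
    "2024_01_10T08_15_00.000000",  # hypothetical earlier run (illustration only)
    "2024_02_21T12_30_40.736831",  # the run recorded in this card
]

# Lexicographic order matches chronological order for this fixed-width
# format, so the most recent run is simply the maximum.
latest_run = max(run_splits)
assert latest_run == "2024_02_21T12_30_40.736831"
```

This is why the "latest" alias can always be resolved without parsing the timestamps.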
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uproai__RosMistral-2x7B",
"harness_winogrande_5",
	split="latest")
```
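Beyond the run-level configs, each MMLU subject has its own configuration (see the list above). As a sketch of the naming pattern, a config name can be derived from a harness task name; `harness_config_name` below is a hypothetical helper written for illustration, not part of the `datasets` library:

```python
def harness_config_name(task: str, num_fewshot: int) -> str:
    """Map a harness task name such as 'hendrycksTest-world_religions'
    to this dataset's config name, e.g.
    'harness_hendrycksTest_world_religions_5'.

    Hyphens and colons in task names become underscores, and the
    few-shot count is appended."""
    return f"harness_{task.replace('-', '_').replace(':', '_')}_{num_fewshot}"

print(harness_config_name("hendrycksTest-world_religions", 5))
# harness_hendrycksTest_world_religions_5
```

The resulting string can be passed as the second argument to `load_dataset` in place of `"harness_winogrande_5"` in the example above.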
## Latest results
These are the [latest results from run 2024-02-21T12:30:40.736831](https://huggingface.co/datasets/open-llm-leaderboard/details_uproai__RosMistral-2x7B/blob/main/results_2024-02-21T12-30-40.736831.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6551960182700177,
"acc_stderr": 0.03189102529877818,
"acc_norm": 0.6570328046732227,
"acc_norm_stderr": 0.03252818516001897,
"mc1": 0.3598531211750306,
"mc1_stderr": 0.016801860466677147,
"mc2": 0.5287191041315256,
"mc2_stderr": 0.01534150118647353
},
"harness|arc:challenge|25": {
"acc": 0.6339590443686007,
"acc_stderr": 0.01407722310847014,
"acc_norm": 0.6621160409556314,
"acc_norm_stderr": 0.013822047922283509
},
"harness|hellaswag|10": {
"acc": 0.6714797849034057,
"acc_stderr": 0.00468715199479107,
"acc_norm": 0.8554072893845848,
"acc_norm_stderr": 0.0035097096477918433
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.032081157507886836,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.032081157507886836
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603348,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603348
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.02967090612463088,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.02967090612463088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.029719142876342853,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.029719142876342853
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374291,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374291
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.034063153607115086,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.034063153607115086
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156861,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156861
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917671,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.0133064782430663,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.0133064782430663
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500107,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500107
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3843575418994413,
"acc_stderr": 0.016269088663959402,
"acc_norm": 0.3843575418994413,
"acc_norm_stderr": 0.016269088663959402
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.02463004897982478,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.02463004897982478
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7654320987654321,
"acc_stderr": 0.023576881744005723,
"acc_norm": 0.7654320987654321,
"acc_norm_stderr": 0.023576881744005723
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4661016949152542,
"acc_stderr": 0.012740853872949837,
"acc_norm": 0.4661016949152542,
"acc_norm_stderr": 0.012740853872949837
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.027678468642144717,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.027678468642144717
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.018850084696468712,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.018850084696468712
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399677,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3598531211750306,
"mc1_stderr": 0.016801860466677147,
"mc2": 0.5287191041315256,
"mc2_stderr": 0.01534150118647353
},
"harness|winogrande|5": {
"acc": 0.7924230465666929,
"acc_stderr": 0.011398593419386772
},
"harness|gsm8k|5": {
"acc": 0.621683093252464,
"acc_stderr": 0.013358407831777105
}
}
```
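The per-task scores above can be aggregated programmatically. A minimal sketch, assuming a results dict shaped like the JSON above (the accuracy values below are copied from a few of the tasks shown; this is an illustration, not the leaderboard's own aggregation code):

```python
# Average per-task accuracy from an Open LLM Leaderboard results dict.
# Keys follow the "harness|<task>|<n_shot>" pattern used above; only a
# few tasks are reproduced here for brevity.
results = {
    "harness|arc:challenge|25": {"acc": 0.6339590443686007},
    "harness|hellaswag|10": {"acc": 0.6714797849034057},
    "harness|winogrande|5": {"acc": 0.7924230465666929},
    "harness|gsm8k|5": {"acc": 0.621683093252464},
}

# Map task name -> accuracy, skipping entries without an "acc" field
# (e.g. the TruthfulQA block reports mc1/mc2 instead of acc).
task_accs = {k.split("|")[1]: v["acc"] for k, v in results.items() if "acc" in v}
mean_acc = sum(task_accs.values()) / len(task_accs)
print(f"mean acc over {len(task_accs)} tasks: {mean_acc:.4f}")
```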
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/albion_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of albion/アルビオン/阿尔比恩 (Azur Lane)
This is the dataset of albion/アルビオン/阿尔比恩 (Azur Lane), containing 52 images and their tags.
The core tags of this character are `long_hair, breasts, pointy_ears, blue_eyes, large_breasts, very_long_hair, bangs, white_hair, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 52 | 85.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/albion_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 52 | 44.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/albion_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 132 | 95.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/albion_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 52 | 73.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/albion_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 132 | 142.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/albion_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/albion_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
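If you prefer not to depend on waifuc, the `IMG+TXT` packages pair each image with a same-stem `.txt` tag file. A minimal sketch of loading such pairs with only the standard library (the file layout, extensions, and comma-separated tag format are assumptions based on the package descriptions above):

```python
from pathlib import Path


def load_img_txt_pairs(dataset_dir):
    """Pair each image with its same-stem .txt tag file (assumed IMG+TXT layout)."""
    pairs = []
    for txt in sorted(Path(dataset_dir).rglob("*.txt")):
        for ext in (".png", ".jpg", ".jpeg", ".webp"):
            img = txt.with_suffix(ext)
            if img.exists():
                # Tag files are assumed to hold a comma-separated tag list.
                tags = [t.strip() for t in txt.read_text().split(",") if t.strip()]
                pairs.append((img, tags))
                break
    return pairs
```

After extracting e.g. `dataset-800.zip`, `load_img_txt_pairs('dataset_dir')` returns `(image_path, tags)` tuples.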
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, bare_shoulders, blush, cleavage, elbow_gloves, elf, looking_at_viewer, navel, solo, bridal_gauntlets, revealing_clothes, white_gloves, white_skirt, closed_mouth, jewelry, simple_background, smile |
| 1 | 5 |  |  |  |  |  | 1girl, black_gloves, demon_horns, elbow_gloves, fur_trim, solo, underboob_cutout, bare_shoulders, looking_at_viewer, simple_background, official_alternate_costume, open_mouth, white_background, black_dress, blush, covered_nipples, sitting |
| 2 | 5 |  |  |  |  |  | 1girl, bare_shoulders, demon_girl, demon_horns, demon_wings, elbow_gloves, looking_at_viewer, solo, underboob, black_gloves, black_dress, blush, tail, thighs, asymmetrical_gloves, black_wings, crossed_legs, curled_horns, fur_trim, long_dress, parted_lips, sitting, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | blush | cleavage | elbow_gloves | elf | looking_at_viewer | navel | solo | bridal_gauntlets | revealing_clothes | white_gloves | white_skirt | closed_mouth | jewelry | simple_background | smile | black_gloves | demon_horns | fur_trim | underboob_cutout | official_alternate_costume | open_mouth | white_background | black_dress | covered_nipples | sitting | demon_girl | demon_wings | underboob | tail | thighs | asymmetrical_gloves | black_wings | crossed_legs | curled_horns | long_dress | parted_lips |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------|:-----------|:---------------|:------|:--------------------|:--------|:-------|:-------------------|:--------------------|:---------------|:--------------|:---------------|:----------|:--------------------|:--------|:---------------|:--------------|:-----------|:-------------------|:-----------------------------|:-------------|:-------------------|:--------------|:------------------|:----------|:-------------|:--------------|:------------|:-------|:---------|:----------------------|:--------------|:---------------|:---------------|:-------------|:--------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | | X | | X | | X | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | | X | | X | | X | | | | | | | | X | X | X | X | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X |
|
phatjk/odqa_data | ---
dataset_info:
features:
- name: text
dtype: string
- name: words
sequence: string
splits:
- name: train
num_bytes: 3515490316
num_examples: 1966167
download_size: 1364666872
dataset_size: 3515490316
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "odqa_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
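The schema above pairs raw `text` with a pre-tokenized `words` sequence. A minimal sketch of building a record in that shape (the actual tokenizer used to produce `words` is not documented on this card; whitespace splitting here is an illustrative assumption):

```python
def to_record(text):
    """Build a record matching the card's schema:
    text (string) and words (sequence of string)."""
    return {"text": text, "words": text.split()}


record = to_record("open domain question answering data")
```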
owanr/o1o2o3_large_r2_coedit_with_human_pref | ---
dataset_info:
features:
- name: src
dtype: string
- name: tgt
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 63142292
num_examples: 206716
download_size: 9041822
dataset_size: 63142292
---
# Dataset Card for "o1o2o3_large_r2_coedit_with_human_pref"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Weyaxi__Dolphin2.1-OpenOrca-7B | ---
pretty_name: Evaluation run of Weyaxi/Dolphin2.1-OpenOrca-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/Dolphin2.1-OpenOrca-7B](https://huggingface.co/Weyaxi/Dolphin2.1-OpenOrca-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Dolphin2.1-OpenOrca-7B_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-09T14:21:27.933712](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Dolphin2.1-OpenOrca-7B_public/blob/main/results_2023-11-09T14-21-27.933712.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6224567348896717,\n\
\ \"acc_stderr\": 0.032466479047476085,\n \"acc_norm\": 0.6308724361156662,\n\
\ \"acc_norm_stderr\": 0.033159611933737225,\n \"mc1\": 0.36107711138310894,\n\
\ \"mc1_stderr\": 0.016814312844836886,\n \"mc2\": 0.538254375639854,\n\
\ \"mc2_stderr\": 0.015244755693358225,\n \"em\": 0.0030411073825503355,\n\
\ \"em_stderr\": 0.0005638896908753155,\n \"f1\": 0.08151740771812048,\n\
\ \"f1_stderr\": 0.0016591952257614033\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6083617747440273,\n \"acc_stderr\": 0.01426412212493821,\n\
\ \"acc_norm\": 0.6416382252559727,\n \"acc_norm_stderr\": 0.014012883334859857\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.650368452499502,\n\
\ \"acc_stderr\": 0.004758790172436687,\n \"acc_norm\": 0.8424616610237005,\n\
\ \"acc_norm_stderr\": 0.0036356303524759065\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926605,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926605\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246487,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246487\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n\
\ \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n\
\ \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139403,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139403\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6333333333333333,\n \"acc_stderr\": 0.02443301646605246,\n \
\ \"acc_norm\": 0.6333333333333333,\n \"acc_norm_stderr\": 0.02443301646605246\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.03077805742293167,\n \
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.03077805742293167\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8128440366972477,\n \"acc_stderr\": 0.016722684526200148,\n \"\
acc_norm\": 0.8128440366972477,\n \"acc_norm_stderr\": 0.016722684526200148\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565437,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565437\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601436,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601436\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.039166677628225836,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.039166677628225836\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.02220930907316561,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.02220930907316561\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8058748403575989,\n\
\ \"acc_stderr\": 0.014143970276657567,\n \"acc_norm\": 0.8058748403575989,\n\
\ \"acc_norm_stderr\": 0.014143970276657567\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.024405173935783234,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.024405173935783234\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3340782122905028,\n\
\ \"acc_stderr\": 0.015774911422381625,\n \"acc_norm\": 0.3340782122905028,\n\
\ \"acc_norm_stderr\": 0.015774911422381625\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.026256053835718964,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.026256053835718964\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.02638527370346449,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.02638527370346449\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.02500646975579921,\n\
\ \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.02500646975579921\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666904,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4621903520208605,\n\
\ \"acc_stderr\": 0.012733671880342507,\n \"acc_norm\": 0.4621903520208605,\n\
\ \"acc_norm_stderr\": 0.012733671880342507\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.029520095697687765,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.029520095697687765\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6470588235294118,\n \"acc_stderr\": 0.01933314202079716,\n \
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.01933314202079716\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291296,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291296\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36107711138310894,\n\
\ \"mc1_stderr\": 0.016814312844836886,\n \"mc2\": 0.538254375639854,\n\
\ \"mc2_stderr\": 0.015244755693358225\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.77663772691397,\n \"acc_stderr\": 0.011705697565205193\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.0030411073825503355,\n \
\ \"em_stderr\": 0.0005638896908753155,\n \"f1\": 0.08151740771812048,\n\
\ \"f1_stderr\": 0.0016591952257614033\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.19711902956785443,\n \"acc_stderr\": 0.01095802163030062\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/Dolphin2.1-OpenOrca-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|arc:challenge|25_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|arc:challenge|25_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|drop|3_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|drop|3_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|gsm8k|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|gsm8k|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hellaswag|10_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hellaswag|10_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-09T14-13-23.628272.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-09T14-21-27.933712.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-09T14-21-27.933712.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- '**/details_harness|winogrande|5_2023-11-09T14-13-23.628272.parquet'
- split: 2023_11_09T14_21_27.933712
path:
- '**/details_harness|winogrande|5_2023-11-09T14-21-27.933712.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-09T14-21-27.933712.parquet'
- config_name: results
data_files:
- split: 2023_11_09T14_13_23.628272
path:
- results_2023-11-09T14-13-23.628272.parquet
- split: 2023_11_09T14_21_27.933712
path:
- results_2023-11-09T14-21-27.933712.parquet
- split: latest
path:
- results_2023-11-09T14-21-27.933712.parquet
---
# Dataset Card for Evaluation run of Weyaxi/Dolphin2.1-OpenOrca-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Weyaxi/Dolphin2.1-OpenOrca-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Weyaxi/Dolphin2.1-OpenOrca-7B](https://huggingface.co/Weyaxi/Dolphin2.1-OpenOrca-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Dolphin2.1-OpenOrca-7B_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-09T14:21:27.933712](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Dolphin2.1-OpenOrca-7B_public/blob/main/results_2023-11-09T14-21-27.933712.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" and "latest" splits for each eval):
```python
{
"all": {
"acc": 0.6224567348896717,
"acc_stderr": 0.032466479047476085,
"acc_norm": 0.6308724361156662,
"acc_norm_stderr": 0.033159611933737225,
"mc1": 0.36107711138310894,
"mc1_stderr": 0.016814312844836886,
"mc2": 0.538254375639854,
"mc2_stderr": 0.015244755693358225,
"em": 0.0030411073825503355,
"em_stderr": 0.0005638896908753155,
"f1": 0.08151740771812048,
"f1_stderr": 0.0016591952257614033
},
"harness|arc:challenge|25": {
"acc": 0.6083617747440273,
"acc_stderr": 0.01426412212493821,
"acc_norm": 0.6416382252559727,
"acc_norm_stderr": 0.014012883334859857
},
"harness|hellaswag|10": {
"acc": 0.650368452499502,
"acc_stderr": 0.004758790172436687,
"acc_norm": 0.8424616610237005,
"acc_norm_stderr": 0.0036356303524759065
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926605,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246487,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246487
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139403,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139403
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6333333333333333,
"acc_stderr": 0.02443301646605246,
"acc_norm": 0.6333333333333333,
"acc_norm_stderr": 0.02443301646605246
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.03077805742293167,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.03077805742293167
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8128440366972477,
"acc_stderr": 0.016722684526200148,
"acc_norm": 0.8128440366972477,
"acc_norm_stderr": 0.016722684526200148
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565437,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601436,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601436
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.02220930907316561,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.02220930907316561
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8058748403575989,
"acc_stderr": 0.014143970276657567,
"acc_norm": 0.8058748403575989,
"acc_norm_stderr": 0.014143970276657567
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.024405173935783234,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.024405173935783234
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3340782122905028,
"acc_stderr": 0.015774911422381625,
"acc_norm": 0.3340782122905028,
"acc_norm_stderr": 0.015774911422381625
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.026256053835718964,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.026256053835718964
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.02638527370346449,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.02638527370346449
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.02500646975579921,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.02500646975579921
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666904,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4621903520208605,
"acc_stderr": 0.012733671880342507,
"acc_norm": 0.4621903520208605,
"acc_norm_stderr": 0.012733671880342507
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.029520095697687765,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.029520095697687765
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.01933314202079716,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.01933314202079716
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291296,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36107711138310894,
"mc1_stderr": 0.016814312844836886,
"mc2": 0.538254375639854,
"mc2_stderr": 0.015244755693358225
},
"harness|winogrande|5": {
"acc": 0.77663772691397,
"acc_stderr": 0.011705697565205193
},
"harness|drop|3": {
"em": 0.0030411073825503355,
"em_stderr": 0.0005638896908753155,
"f1": 0.08151740771812048,
"f1_stderr": 0.0016591952257614033
},
"harness|gsm8k|5": {
"acc": 0.19711902956785443,
"acc_stderr": 0.01095802163030062
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
FVilmar/amado_batista | ---
license: openrail
---
|
aniketr/pickapic-embeds | ---
license: mit
---
|
liuyanchen1015/MULTI_VALUE_stsb_my_i | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 7459
num_examples: 38
- name: test
num_bytes: 1649
num_examples: 11
- name: train
num_bytes: 3186
num_examples: 23
download_size: 17203
dataset_size: 12294
---
# Dataset Card for "MULTI_VALUE_stsb_my_i"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chrystians/oasst1_pl_3_threads | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3831992
num_examples: 9317
- name: validation
num_bytes: 122120
num_examples: 348
download_size: 1929012
dataset_size: 3954112
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
vanande/jorj | ---
task_categories:
- question-answering
language:
- en
size_categories:
- n<1K
--- |
tramzel/myfooddata_1_4 | ---
license: unknown
---
|
maixbach/insert-vnese-accent-20240408 | ---
dataset_info:
features:
- name: index
dtype: int64
- name: Input
dtype: string
- name: Output
dtype: string
- name: Sentence_length
dtype: int64
- name: long_text
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 9286097
num_examples: 2000
download_size: 4365542
dataset_size: 9286097
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
narySt/github_commits | ---
license: mit
dataset_info:
features:
- name: input_ids
sequence: int64
- name: attention_mask
sequence: int64
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 257967900
num_examples: 20973
- name: val
num_bytes: 45891300
num_examples: 3731
download_size: 10916827
dataset_size: 303859200
language:
- en
pretty_name: github-commits
size_categories:
- 10K<n<100K
---
This dataset contains the code changes in each commit of the most starred Python projects hosted on GitHub.
## Code to reproduce the parsing process
To parse the code we performed the following steps:
* Get the list of the most starred GitHub repos via the GitHub API.
* With the **git** Python package, clone all the repos from the list to a local machine and write the code difference for each commit of every repo to the dataset.
* Clean the dataset to remove too-large commits, commits with no Python code changes, commits with non-ASCII chars, etc.
* Group the files changed in one commit into a single sample of the dataset.
To reproduce these steps:
1) run *src/github_parsing.ipynb* to parse the repos from GitHub
2) run *src/data_cleaning.ipynb* to clean the data and group the dataset samples
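The cleaning step above can be sketched as a small filter. This is an illustrative reimplementation, not the notebook code: the `keep_commit` helper and the 20,000-character size limit are assumptions, not the thresholds actually used in *src/data_cleaning.ipynb*.

```python
# Hypothetical sketch of the cleaning filters: drop commits that are too
# large, touch no Python files, or contain non-ASCII characters.
MAX_CHANGE_LEN = 20_000  # assumed threshold, not the original one

def keep_commit(changed_files: list[str], changes: str) -> bool:
    """Return True if the commit should stay in the dataset."""
    touches_python = any(name.endswith(".py") for name in changed_files)
    small_enough = len(changes) <= MAX_CHANGE_LEN
    ascii_only = changes.isascii()
    return touches_python and small_enough and ascii_only
```

For example, a commit that only edits `README.md`, or one whose diff contains non-ASCII characters, would be dropped.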
## Dataset features
The dataset has the following features:
1) repo_name
2) commit_message
3) commit_changes - the code changes in all Python files contained in the commit
4) files_changed - the number of files changed in the commit
5) changes_len - the number of characters in the code changes
For model training we used only the *commit_message* feature as the label and *commit_changes* as the input to the model.
Code changes have the following structure:
```
<filename> name_of_the_file <filename>
code_of_changes
<commit_msg>
```
Special tokens used in the input:
* `<filename>` - delimits the name of the file
* `<code_del>` and `<code_add>` - delimit deleted and added lines of code in the commit
* `<commit_msg>` - delimits the commit message
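A minimal sketch of how diff lines might be wrapped with these tokens. This is an illustrative reimplementation under assumptions, not the original pipeline code; the `wrap_diff` helper and its signature are invented for the example.

```python
# Hypothetical sketch: wrap a unified-diff hunk with the dataset's special
# tokens (<filename>, <code_del>, <code_add>). Context and @@-hunk lines
# are kept as-is, matching the example input shown below.
def wrap_diff(filename_line: str, diff_lines: list[str]) -> str:
    out = [f"<filename> {filename_line}<filename>"]
    for line in diff_lines:
        if line.startswith("-"):
            out.append(f"<code_del>{line}<code_del>")
        elif line.startswith("+"):
            out.append(f"<code_add>{line}<code_add>")
        else:
            out.append(line)  # context / hunk-header lines stay unchanged
    return "\n".join(out)
```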
Example of input for the model:
```
<filename> a/tests/test_constraint.py b/tests/test_constraint.py<filename>
<code_del>--- a/tests/test_constraint.py<code_del>
<code_add>+++ b/tests/test_constraint.py<code_add>
@@ -87,10 +87,15 @@ def test_accurate_approximation_when_known():
n_iter=10,
)
<code_del>- params = optimizer.res[0]["params"]<code_del>
<code_del>- x, y = params['x'], params['y']<code_del>
<code_add>+ # Exclude the last sampled point, because the constraint is not fitted on that.<code_add>
<code_add>+ res = np.array([[r['target'], r['constraint'], r['params']['x'], r['params']['y']] for r in optimizer.res[:-1]])<code_add>
<code_add>+<code_add>
<code_add>+ xy = res[:, [2, 3]]<code_add>
<code_add>+ x = res[:, 2]<code_add>
<code_add>+ y = res[:, 3]<code_add>
<code_del>- assert constraint_function(x, y) == approx(conmod.approx(np.array([x, y])), rel=1e-5, abs=1e-5)<code_del>
<code_add>+ assert constraint_function(x, y) == approx(conmod.approx(xy), rel=1e-5, abs=1e-5)<code_add>
<code_add>+ assert constraint_function(x, y) == approx(optimizer.space.constraint_values[:-1], rel=1e-5, abs=1e-5)<code_add>
def test_multiple_constraints():
<commit_msg>In case of commit with the several files changed, different files are separated with 3 blank lines.<eos>
```
When a commit changes several files, the files are separated by three blank lines. |
pvduy/sharegpt_alpaca_oa_gpt4all_vicuna_format | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 1164685190
num_examples: 581780
- name: test
num_bytes: 7267058
num_examples: 2000
download_size: 607698621
dataset_size: 1171952248
---
# Dataset Card for "sharegpt_alpaca_oa_gpt4all_vicuna_format"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Cheetor1996/Ayane_Shirakawa | ---
license: cc-by-2.0
---
|
mazkooleg/digit_mask_augmented_raw | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: label
dtype: bool
splits:
- name: train
num_bytes: 58513564703.2
num_examples: 1825800
- name: test
num_bytes: 195044953.756
num_examples: 6086
- name: validation
num_bytes: 169086020.324
num_examples: 5276
download_size: 54506700314
dataset_size: 58877695677.27999
---
# Dataset Card for "digit_mask_augmented_raw"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-from-one-sec-cv12/chunk_209 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1150522552
num_examples: 224186
download_size: 1176160238
dataset_size: 1150522552
---
# Dataset Card for "chunk_209"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fun1021183/test_cvtGS3 | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 15127712.0
num_examples: 100
download_size: 15105334
dataset_size: 15127712.0
---
# Dataset Card for "test_cvtGS3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Rewcifer/validation_2000_cutoff_llama-2-7b-tyellow-2k-cutoff-LR1-clean-train_first_100 | ---
dataset_info:
features:
- name: labels_and_findings
dtype: string
- name: prompts
dtype: string
- name: true_findings
dtype: string
- name: generated_texts
dtype: string
splits:
- name: train
num_bytes: 895238
num_examples: 100
download_size: 252291
dataset_size: 895238
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "validation_2000_cutoff_llama-2-7b-tyellow-2k-cutoff-LR1-clean-train_first_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mncai/ko-rag-chatbot-arena | ---
license: apache-2.0
---
|
gguichard/wsd_myriade_synth_data_id_label_test | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: tokens
sequence: string
- name: wn_sens
sequence: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 340874.41102362203
num_examples: 571
- name: test
num_bytes: 38206.588976377956
num_examples: 64
download_size: 84446
dataset_size: 379081.0
---
# Dataset Card for "wsd_myriade_synth_data_id_label_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mserras/alpaca-es-autoclean | ---
dataset_info:
features:
- name: text
dtype: 'null'
- name: inputs
struct:
- name: 1-instruction
dtype: string
- name: 2-input
dtype: string
- name: 3-output
dtype: string
- name: prediction
dtype: 'null'
- name: prediction_agent
dtype: 'null'
- name: annotation
dtype: string
- name: annotation_agent
dtype: string
- name: vectors
struct:
- name: input
sequence: float64
- name: instruction
sequence: float64
- name: output
sequence: float64
- name: multi_label
dtype: bool
- name: explanation
dtype: 'null'
- name: id
dtype: string
- name: metadata
struct:
- name: en_index
dtype: int64
- name: tr-flag-1-instruction
dtype: bool
- name: tr-flag-2-input
dtype: bool
- name: tr-flag-3-output
dtype: bool
- name: status
dtype: string
- name: event_timestamp
dtype: timestamp[us]
- name: metrics
struct:
- name: text_length
dtype: int64
splits:
- name: train
num_bytes: 14035334
num_examples: 746
download_size: 10244494
dataset_size: 14035334
---
# Dataset Card for "alpaca-es-autoclean"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/VALUE_qnli_drop_aux | ---
dataset_info:
features:
- name: question
dtype: string
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 328960
num_examples: 1293
- name: test
num_bytes: 357205
num_examples: 1351
- name: train
num_bytes: 6338438
num_examples: 25360
download_size: 4425354
dataset_size: 7024603
---
# Dataset Card for "VALUE_qnli_drop_aux"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BrachioLab/supernova_timeseries | |
anjalyjayakrishnan/sample | ---
dataset_info:
features:
- name: id
dtype: string
- name: package_name
dtype: string
- name: review
dtype: string
- name: date
dtype: string
- name: star
dtype: int64
- name: version_id
dtype: int64
splits:
- name: train
num_bytes: 1508
num_examples: 5
- name: test
num_bytes: 956
num_examples: 5
download_size: 7783
dataset_size: 2464
---
# Dataset Card for "sample"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gregvascaino/xplebe | ---
license: openrail
---
|
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o_gate_up_down | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o_gate_up_down
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o_gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o_gate_up_down)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o_gate_up_down\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-23T20:38:43.252045](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o_gate_up_down/blob/main/results_2023-10-23T20-38-43.252045.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3701761744966443,\n\
\ \"em_stderr\": 0.004944853456208216,\n \"f1\": 0.4095354446308729,\n\
\ \"f1_stderr\": 0.004845432044443532,\n \"acc\": 0.4391206057062913,\n\
\ \"acc_stderr\": 0.01050548040574193\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.3701761744966443,\n \"em_stderr\": 0.004944853456208216,\n\
\ \"f1\": 0.4095354446308729,\n \"f1_stderr\": 0.004845432044443532\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12054586808188021,\n \
\ \"acc_stderr\": 0.008968608285309073\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7576953433307024,\n \"acc_stderr\": 0.012042352526174789\n\
\ }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o_gate_up_down
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|arc:challenge|25_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_23T20_38_43.252045
path:
- '**/details_harness|drop|3_2023-10-23T20-38-43.252045.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-23T20-38-43.252045.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_23T20_38_43.252045
path:
- '**/details_harness|gsm8k|5_2023-10-23T20-38-43.252045.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-23T20-38-43.252045.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hellaswag|10_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-17-27.993942.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T05-17-27.993942.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T05-17-27.993942.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_23T20_38_43.252045
path:
- '**/details_harness|winogrande|5_2023-10-23T20-38-43.252045.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-23T20-38-43.252045.parquet'
- config_name: results
data_files:
- split: 2023_10_04T05_17_27.993942
path:
- results_2023-10-04T05-17-27.993942.parquet
- split: 2023_10_23T20_38_43.252045
path:
- results_2023-10-23T20-38-43.252045.parquet
- split: latest
path:
- results_2023-10-23T20-38-43.252045.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o_gate_up_down
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o_gate_up_down
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o_gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o_gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o_gate_up_down",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-23T20:38:43.252045](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r16-q_k_v_o_gate_up_down/blob/main/results_2023-10-23T20-38-43.252045.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.3701761744966443,
"em_stderr": 0.004944853456208216,
"f1": 0.4095354446308729,
"f1_stderr": 0.004845432044443532,
"acc": 0.4391206057062913,
"acc_stderr": 0.01050548040574193
},
"harness|drop|3": {
"em": 0.3701761744966443,
"em_stderr": 0.004944853456208216,
"f1": 0.4095354446308729,
"f1_stderr": 0.004845432044443532
},
"harness|gsm8k|5": {
"acc": 0.12054586808188021,
"acc_stderr": 0.008968608285309073
},
"harness|winogrande|5": {
"acc": 0.7576953433307024,
"acc_stderr": 0.012042352526174789
}
}
```
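As a consistency check, the aggregate "acc" in the "all" block above is simply the mean of the per-task accuracies. The following minimal sketch (using the values copied from the JSON above; it is not part of the leaderboard tooling) reproduces it:

```python
# Per-task metrics copied from the "Latest results" JSON above.
results = {
    "harness|drop|3": {"em": 0.3701761744966443, "f1": 0.4095354446308729},
    "harness|gsm8k|5": {"acc": 0.12054586808188021},
    "harness|winogrande|5": {"acc": 0.7576953433307024},
}

# The "all" block averages each metric over the tasks that report it.
accs = [m["acc"] for m in results.values() if "acc" in m]
mean_acc = sum(accs) / len(accs)
print(mean_acc)  # matches "acc" in the "all" block: 0.4391206057062913
```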
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CyberHarem/catapult_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of catapult/カタパルト/空爆 (Arknights)
This is the dataset of catapult/カタパルト/空爆 (Arknights), containing 13 images and their tags.
The core tags of this character are `brown_hair, animal_ears, multicolored_hair, short_hair, breasts, green_eyes, hair_between_eyes, red_hair, hair_ornament, hairclip, horse_ears, horse_girl, tail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 13 | 11.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/catapult_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 13 | 10.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/catapult_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 28 | 20.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/catapult_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/catapult_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, smile, open_jacket, black_shorts, long_sleeves, looking_at_viewer, open_mouth, solo, black_shirt, blush, choker, midriff, belt, black_thighhighs, green_jacket, navel, short_shorts, simple_background, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | smile | open_jacket | black_shorts | long_sleeves | looking_at_viewer | open_mouth | solo | black_shirt | blush | choker | midriff | belt | black_thighhighs | green_jacket | navel | short_shorts | simple_background | upper_body |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------|:---------------|:---------------|:--------------------|:-------------|:-------|:--------------|:--------|:---------|:----------|:-------|:-------------------|:---------------|:--------|:---------------|:--------------------|:-------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-104000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 664423
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Lazycuber/unalignment-airoboros-2.2 | ---
license: other
license_name: datasets
license_link: LICENSE
---
|
mangaphd/HausaLexicons | ---
license: ecl-2.0
---
|
namwooooo/embedded-things | ---
license: mit
---
|
BhabhaAI/indic-instruct-data-v0.1-filtered | ---
language:
- en
- hi
multilinguality:
- multilingual
size_categories:
- 5K<n<400K
language_bcp47:
- en-US
- hi-IN
configs:
- config_name: anudesh
data_files:
- split: en
path: anudesh/en*
- split: hi
path: anudesh/hi*
- config_name: dolly
data_files:
- split: en
path: dolly/en*
- split: hi
path: dolly/hi*
- config_name: flan_v2
data_files:
- split: en
path: flan_v2/en*
- split: hi
path: flan_v2/hi*
- config_name: hh-rlhf
data_files:
- split: en
path: hh-rlhf/en*
- split: hi
path: hh-rlhf/hi*
- config_name: nmt-seed
data_files:
- split: hi
path: nmt-seed/hi*
- config_name: wikihow
data_files:
- split: en
path: wikihow/en*
- split: hi
path: wikihow/hi*
- config_name: oasst1
data_files:
- split: en
path: oasst1/en*
- split: hi
path: oasst1/hi*
- config_name: lm_sys
data_files:
- split: en
path: lm_sys/en*
- split: hi
path: lm_sys/hi*
---
This is a filtered version of [indic-instruct-data-v0.1](https://huggingface.co/datasets/ai4bharat/indic-instruct-data-v0.1).
**UPDATE: 4 March 2024** - This dataset has been further filtered to create [indic-instruct-data-v0.2-filtered](https://huggingface.co/datasets/BhabhaAI/indic-instruct-data-v0.2-filtered).
## Filtering Approach
1. Drop examples containing any of the following substrings: ["search the web", "www.", ".py", ".com", "spanish", "french", "japanese", "given two strings, check whether one string is a rotation of another", "openai", "xml", "arrange the words", "__", "noinput", "idiom", "alphabetic", "alliteration", "translat", "paraphrase", "code", "def ", "http", "https", "index.html", "html", "python", "```", "identify the language", "word count", "number of words", "count the number", "identify the language", "spelling", "word count", " x ", " y ", "'x'", "'y'", "language"]
2. Compare the English-to-translated-Hindi word and character ratios to avoid duplicated words in translation. This drops rows containing repetition of characters/words. Example: ल्लोलोलोलोलोलोलोल or न्याय की दृष्टि से न्याय की दृष्टि से न्याय की दृष्टि से न्याय की दृष्टि से
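The two steps above can be sketched as follows. This is a hypothetical re-implementation: the keyword list is abbreviated and the ratio threshold is an assumption, not the value actually used for the release:

```python
# Hypothetical re-implementation of the two filtering steps described above;
# the banned-substring list is abbreviated and max_ratio is an assumed threshold.
BANNED = ["search the web", "www.", ".py", ".com", "http", "```", "translat"]

def keep_example(en_text: str, hi_text: str, max_ratio: float = 3.0) -> bool:
    lowered = en_text.lower()
    # Step 1: drop examples containing any banned substring.
    if any(b in lowered for b in BANNED):
        return False
    # Step 2: drop translations whose word count balloons relative to the
    # English source, a symptom of repeated characters/words in translation.
    en_words, hi_words = len(en_text.split()), len(hi_text.split())
    if en_words and hi_words / en_words > max_ratio:
        return False
    return True

print(keep_example("Summarise this paragraph.", "इस अनुच्छेद का सारांश दें।"))  # True
print(keep_example("Search the web for cats.", "बिल्लियों के लिए वेब खोजें।"))  # False
```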
The Anudesh and oasst1 datasets have been kept as-is because they don't have English counterparts to filter against. |
davanstrien/ToadFishFinder | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: labels
dtype: string
splits:
- name: train
num_bytes: 2711834755.57
num_examples: 20914
download_size: 2707887140
dataset_size: 2711834755.57
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ToadFishFinder"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nev/diverse-depth | ---
license: other
---
|
MoreMemes/Image | ---
license: openrail
---
|
ehusaint/dataset-lisan-tiny | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: client_id
dtype: int64
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 516365.0
num_examples: 3
- name: test
num_bytes: 230885.0
num_examples: 1
download_size: 720756
dataset_size: 747250.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v5 | ---
pretty_name: Evaluation run of kekmodel/StopCarbon-10.7B-v5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [kekmodel/StopCarbon-10.7B-v5](https://huggingface.co/kekmodel/StopCarbon-10.7B-v5)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v5\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-30T16:25:24.948425](https://huggingface.co/datasets/open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v5/blob/main/results_2023-12-30T16-25-24.948425.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.667270432389036,\n\
\ \"acc_stderr\": 0.03161503740481807,\n \"acc_norm\": 0.6679793731390249,\n\
\ \"acc_norm_stderr\": 0.032260225407857515,\n \"mc1\": 0.5716034271725826,\n\
\ \"mc1_stderr\": 0.017323088597314747,\n \"mc2\": 0.7183713907727333,\n\
\ \"mc2_stderr\": 0.014997186929843767\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6851535836177475,\n \"acc_stderr\": 0.01357265770308495,\n\
\ \"acc_norm\": 0.7098976109215017,\n \"acc_norm_stderr\": 0.013261573677520767\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7143995220075682,\n\
\ \"acc_stderr\": 0.0045077680295901,\n \"acc_norm\": 0.8847839075881299,\n\
\ \"acc_norm_stderr\": 0.0031863002304505774\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n\
\ \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800886,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800886\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6297872340425532,\n \"acc_stderr\": 0.03156564682236786,\n\
\ \"acc_norm\": 0.6297872340425532,\n \"acc_norm_stderr\": 0.03156564682236786\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.040131241954243856,\n\
\ \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.040131241954243856\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5026455026455027,\n \"acc_stderr\": 0.02575094967813038,\n \"\
acc_norm\": 0.5026455026455027,\n \"acc_norm_stderr\": 0.02575094967813038\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n\
\ \"acc_stderr\": 0.021886178567172534,\n \"acc_norm\": 0.8193548387096774,\n\
\ \"acc_norm_stderr\": 0.021886178567172534\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644244,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644244\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37407407407407406,\n \"acc_stderr\": 0.029502861128955286,\n \
\ \"acc_norm\": 0.37407407407407406,\n \"acc_norm_stderr\": 0.029502861128955286\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634332,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634332\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"\
acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5787037037037037,\n \"acc_stderr\": 0.033674621388960775,\n \"\
acc_norm\": 0.5787037037037037,\n \"acc_norm_stderr\": 0.033674621388960775\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \
\ \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.0230866350868414,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.0230866350868414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n\
\ \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n\
\ \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.023083658586984204,\n\
\ \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.023083658586984204\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39776536312849164,\n\
\ \"acc_stderr\": 0.01636920497126298,\n \"acc_norm\": 0.39776536312849164,\n\
\ \"acc_norm_stderr\": 0.01636920497126298\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7839506172839507,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.7839506172839507,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4915254237288136,\n\
\ \"acc_stderr\": 0.012768401697269057,\n \"acc_norm\": 0.4915254237288136,\n\
\ \"acc_norm_stderr\": 0.012768401697269057\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.02655651947004151,\n\
\ \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.02655651947004151\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.02553843336857834,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.02553843336857834\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5716034271725826,\n\
\ \"mc1_stderr\": 0.017323088597314747,\n \"mc2\": 0.7183713907727333,\n\
\ \"mc2_stderr\": 0.014997186929843767\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8358326756116812,\n \"acc_stderr\": 0.010410849775222789\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6520090978013646,\n \
\ \"acc_stderr\": 0.013120581030382134\n }\n}\n```"
repo_url: https://huggingface.co/kekmodel/StopCarbon-10.7B-v5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|arc:challenge|25_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|arc:challenge|25_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|gsm8k|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|gsm8k|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hellaswag|10_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hellaswag|10_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T16-10-07.476950.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T16-25-24.948425.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T16-25-24.948425.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- '**/details_harness|winogrande|5_2023-12-30T16-10-07.476950.parquet'
- split: 2023_12_30T16_25_24.948425
path:
- '**/details_harness|winogrande|5_2023-12-30T16-25-24.948425.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-30T16-25-24.948425.parquet'
- config_name: results
data_files:
- split: 2023_12_30T16_10_07.476950
path:
- results_2023-12-30T16-10-07.476950.parquet
- split: 2023_12_30T16_25_24.948425
path:
- results_2023-12-30T16-25-24.948425.parquet
- split: latest
path:
- results_2023-12-30T16-25-24.948425.parquet
---
# Dataset Card for Evaluation run of kekmodel/StopCarbon-10.7B-v5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kekmodel/StopCarbon-10.7B-v5](https://huggingface.co/kekmodel/StopCarbon-10.7B-v5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v5",
"harness_winogrande_5",
split="train")
```
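Because the split names use a fixed-width year-to-microsecond timestamp layout (e.g. `2023_12_30T16_25_24.948425`), they sort lexicographically in chronological order, so the run that `latest` mirrors is simply the maximum split name. A minimal sketch of resolving the most recent run yourself, assuming you already have the list of timestamped split names:

```python
# Timestamped split names from this dataset's configurations; the
# fixed-width layout makes lexicographic order equal chronological order.
split_names = [
    "2023_12_30T16_10_07.476950",
    "2023_12_30T16_25_24.948425",
]

# The "latest" split always points at the run with the greatest timestamp.
latest = max(split_names)
print(latest)  # → 2023_12_30T16_25_24.948425
```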
## Latest results
These are the [latest results from run 2023-12-30T16:25:24.948425](https://huggingface.co/datasets/open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v5/blob/main/results_2023-12-30T16-25-24.948425.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.667270432389036,
"acc_stderr": 0.03161503740481807,
"acc_norm": 0.6679793731390249,
"acc_norm_stderr": 0.032260225407857515,
"mc1": 0.5716034271725826,
"mc1_stderr": 0.017323088597314747,
"mc2": 0.7183713907727333,
"mc2_stderr": 0.014997186929843767
},
"harness|arc:challenge|25": {
"acc": 0.6851535836177475,
"acc_stderr": 0.01357265770308495,
"acc_norm": 0.7098976109215017,
"acc_norm_stderr": 0.013261573677520767
},
"harness|hellaswag|10": {
"acc": 0.7143995220075682,
"acc_stderr": 0.0045077680295901,
"acc_norm": 0.8847839075881299,
"acc_norm_stderr": 0.0031863002304505774
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.756578947368421,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.756578947368421,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800886,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800886
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6297872340425532,
"acc_stderr": 0.03156564682236786,
"acc_norm": 0.6297872340425532,
"acc_norm_stderr": 0.03156564682236786
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.040131241954243856,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.040131241954243856
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5026455026455027,
"acc_stderr": 0.02575094967813038,
"acc_norm": 0.5026455026455027,
"acc_norm_stderr": 0.02575094967813038
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.021886178567172534,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.021886178567172534
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644244,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644244
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37407407407407406,
"acc_stderr": 0.029502861128955286,
"acc_norm": 0.37407407407407406,
"acc_norm_stderr": 0.029502861128955286
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634332,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634332
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5787037037037037,
"acc_stderr": 0.033674621388960775,
"acc_norm": 0.5787037037037037,
"acc_norm_stderr": 0.033674621388960775
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.023363878096632446,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.023363878096632446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.0230866350868414,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.0230866350868414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39776536312849164,
"acc_stderr": 0.01636920497126298,
"acc_norm": 0.39776536312849164,
"acc_norm_stderr": 0.01636920497126298
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7839506172839507,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.7839506172839507,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4915254237288136,
"acc_stderr": 0.012768401697269057,
"acc_norm": 0.4915254237288136,
"acc_norm_stderr": 0.012768401697269057
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7426470588235294,
"acc_stderr": 0.02655651947004151,
"acc_norm": 0.7426470588235294,
"acc_norm_stderr": 0.02655651947004151
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857834,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857834
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598052,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598052
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5716034271725826,
"mc1_stderr": 0.017323088597314747,
"mc2": 0.7183713907727333,
"mc2_stderr": 0.014997186929843767
},
"harness|winogrande|5": {
"acc": 0.8358326756116812,
"acc_stderr": 0.010410849775222789
},
"harness|gsm8k|5": {
"acc": 0.6520090978013646,
"acc_stderr": 0.013120581030382134
}
}
```
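The per-task entries above all share the same key layout (`acc`, `acc_stderr`, and optionally the `_norm` variants), so summary statistics can be recomputed directly from the JSON. A hedged, illustrative sketch using two of the values shown above (the real aggregate covers all 57 MMLU subsets plus the other benchmarks, so the number below is not the card's "all" figure):

```python
# A small excerpt of the results dict shown above; the real results
# file contains one such entry per evaluated task.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.42},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6148148148148148},
}

# Unweighted mean accuracy over the selected tasks (illustrative only).
accs = [task["acc"] for task in results.values()]
mean_acc = sum(accs) / len(accs)
print(round(mean_acc, 4))  # → 0.5174
```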
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tianyi0216/stylegan_data | ---
dataset_info:
features:
- name: source_img
dtype: image
- name: instruction
dtype: string
- name: target_img
dtype: image
splits:
- name: train
num_bytes: 2910811213.15
num_examples: 1995
download_size: 2964893208
dataset_size: 2910811213.15
---
# Dataset Card for "stylegan_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlekseyKorshuk/davinci-vs-lit-pairwise | ---
dataset_info:
features:
- name: davinci
dtype: string
- name: lit
dtype: string
- name: prompt
dtype: string
- name: api_prompt
dtype: string
splits:
- name: train
num_bytes: 1845380427
num_examples: 47954
download_size: 809346083
dataset_size: 1845380427
---
# Dataset Card for "davinci-vs-lit-pairwise"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
unimelb-nlp/Multi-EuP | ---
license: apache-2.0
task_categories:
- text-retrieval
size_categories:
- 10K<n<100K
language:
- en
- de
- fr
- it
- es
- pl
- ro
- nl
- el
- hu
- pt
- cs
- sv
- bg
- da
- fi
- sk
- lt
- hr
- sl
- et
- lv
- mt
- ga
pretty_name: multi_eup
configs:
- config_name: default
data_files:
- split: full
path:
- "MultiEuP.csv"
---
## NOTES FOR DOWNLOAD!
1. We highly recommend downloading it via the API:
```bash
curl -X GET \
"https://datasets-server.huggingface.co/first-rows?dataset=unimelb-nlp%2FMulti-EuP&config=default&split=full"
```
2. If you are using the HuggingFace library, please follow these steps:
```bash
pip install datasets
```
```python
from datasets import load_dataset
dataset = load_dataset("unimelb-nlp/Multi-EuP", keep_default_na=False)
```
Note: It's crucial to use **keep_default_na=False**, because some columns contain 'null' values. For example, qid_GA can be null because Irish (GA) debate titles were not published before Irish became an official EU language on 1 January 2022. Additionally, some debate texts may not belong to the 705 active MEPs, resulting in missing matching information.
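The effect of this flag can be seen with pandas directly (the `datasets` CSV builder forwards the argument to `pandas.read_csv`). A minimal sketch with invented rows:

```python
import io
import pandas as pd

# Invented rows: one empty qid field and one literal "null" string
raw = "qid_GA,TEXT\n,hello\nnull,world\n"

# Default pandas behaviour: empty fields and the string "null" both become NaN
default = pd.read_csv(io.StringIO(raw))

# keep_default_na=False preserves them as the plain strings "" and "null"
kept = pd.read_csv(io.StringIO(raw), keep_default_na=False)
```

With the default settings, both rows lose their original values to NaN; with `keep_default_na=False`, the distinction between an empty field and a literal "null" is preserved.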
### Dataset Description
- **Homepage:**
- **Repository:** [Multi-EuP Dataset repository](https://github.com/jrnlp/Multi-EuP)
- **Paper:** [Multi-EuP: The Multilingual European Parliament Dataset for Analysis of Bias in Information Retrieval](https://arxiv.org/pdf/2311.01870.pdf)
- **Leaderboard:** Papers with Code leaderboard for Multi-EuP (coming soon)
- **Point of Contact:** [Jinrui Yang](mailto:jinruiy@student.unimelb.edu.au)
### Dataset Summary
Multi-EuP is a new multilingual benchmark dataset comprising 22K multilingual documents collected from the European Parliament and spanning 24 languages. It is designed to support the investigation of fairness in a multilingual information retrieval (IR) context, enabling analysis of both language and demographic bias in a ranking setting. It offers an authentic multilingual corpus, featuring topics translated into all 24 languages, as well as cross-lingual relevance judgments. Furthermore, it provides rich demographic information associated with its documents, facilitating the study of demographic bias.
### Dataset statistics
| Language | ISO code | Countries where official lang. | Native Usage | Total Usage | # Docs | Words per Doc (mean/median) |
|----------|----------|--------------------------------|--------------|-------------|-------|------------------------------|
| English | EN | United Kingdom, Ireland, Malta | 13% | 51% | 7123 | 286/200 |
| German | DE | Germany, Belgium, Luxembourg | 16% | 32% | 3433 | 180/164 |
| French | FR | France, Belgium, Luxembourg | 12% | 26% | 2779 | 296/223 |
| Italian | IT | Italy | 13% | 16% | 1829 | 190/175 |
| Spanish | ES | Spain | 8% | 15% | 2371 | 232/198 |
| Polish | PL | Poland | 8% | 9% | 1841 | 155/148 |
| Romanian | RO | Romania | 5% | 5% | 794 | 186/172 |
| Dutch | NL | Netherlands, Belgium | 4% | 5% | 897 | 184/170 |
| Greek | EL | Greece, Cyprus | 3% | 4% | 707 | 209/205 |
| Hungarian| HU | Hungary | 3% | 3% | 614 | 126/128 |
| Portuguese| PT | Portugal | 2% | 3% | 1176 | 179/167 |
| Czech | CS | Czech Republic | 2% | 3% | 397 | 167/149 |
| Swedish | SV | Sweden | 2% | 3% | 531 | 175/165 |
| Bulgarian| BG | Bulgaria | 2% | 2% | 408 | 196/178 |
| Danish | DA | Denmark | 1% | 1% | 292 | 218/198 |
| Finnish | FI | Finland | 1% | 1% | 405 | 94/87 |
| Slovak | SK | Slovakia | 1% | 1% | 348 | 151/158 |
| Lithuanian| LT | Lithuania | 1% | 1% | 115 | 142/127 |
| Croatian | HR | Croatia | <1% | <1% | 524 | 183/164 |
| Slovene | SL | Slovenia | <1% | <1% | 270 | 201/163 |
| Estonian | ET | Estonia | <1% | <1% | 58 | 160/158 |
| Latvian | LV | Latvia | <1% | <1% | 89 | 111/123 |
| Maltese | MT | Malta | <1% | <1% | 178 | 117/115 |
| Irish | GA | Ireland | <1% | <1% | 33 | 198/172 |
*Table 1: Multi-EuP statistics, broken down by language: ISO language code; EU member states using the language officially; proportion of the EU population speaking the language; number of debate speech documents in Multi-EuP; and words per document (mean/median).*
## Dataset Structure
The Multi-EuP dataset contains two files: the debate corpus ([Debates.csv](https://huggingface.co/datasets/unimelb-nlp/Multi-EuP/blob/main/Debates.csv)) and the MEP info ([MEPs.csv](https://huggingface.co/datasets/unimelb-nlp/Multi-EuP/blob/main/MEPs.csv)). The MEP id in the two files can be used for alignment.
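The alignment can be sketched with a pandas join on the MEP id. The column names `MEPID` and `id` are taken from the field descriptions in this card; the sample rows are invented for illustration:

```python
import pandas as pd

# Toy rows standing in for Debates.csv and MEPs.csv (values are invented)
debates = pd.DataFrame({
    "TEXTID": ["a1", "b2"],
    "MEPID": [1001, 1002],
    "LANGUAGE": ["EN", "FR"],
})
meps = pd.DataFrame({
    "id": [1001, 1002],
    "fullName": ["Alice Example", "Bob Example"],
    "gender": ["F", "M"],
})

# Align each speech with its speaker's demographic record via the MEP id
joined = debates.merge(meps, left_on="MEPID", right_on="id", how="left")
```

A left join keeps every speech even when no matching MEP record exists, which matters given the missing matching information noted above.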
### Debate Corpus Fields
The debate instance and its attributes are described below. See the [Multi-EuP debate viewer](https://huggingface.co/datasets/unimelb-nlp/Multi-EuP/viewer/default/train) to explore more examples.
- `TEXT`: A string representing the content of the debate speech.
- `NAME`: A string containing the name of the MEP who presented the speech.
- `PRESIDENT`: A boolean indicating whether the MEP is the president (typically discussing procedural matters to introduce the debate).
- `MEPID`: An integer representing the unique ID of the MEP in the EU.
- `LANGUAGE`: The language ISO code of the text.
- `PARTY`: A string representing the political party of the MEP.
- `TEXTID`: A hash string serving as a unique identifier for the speech text.
- `CODICT`: An integer serving as the unique identifier for the speech text.
- `DATE`: A string indicating the date when the debate happened.
- `VOD-START`: The timestamp of the speech start.
- `VOD-END`: The timestamp of the speech end.
- `title_X`: A string representing the title in language X (e.g., `title_EN`). Note that this field might be empty for some languages, such as GA, as the EU does not publish titles in Irish (GA).
- `did`: A string representing the unique ID of the text (e.g., `doc0`, `doc1`).
- `qid_X`: A string representing the unique ID of the title in language X (e.g., `qid0#EN`).
### MEP Info Fields
The information dictionary for the 705 MEPs was constructed as follows:
- `fullName`: A string representing the full name of the MEP.
- `politicalGroup`: A string indicating the political group affiliation of the MEP.
- `id`: An integer representing the unique identifier of the MEP in the EU.
- `nationalPoliticalGroup`: A string denoting the national political group of the MEP.
- `photo`: A .jpg file containing the profile picture of the MEP.
- `nameAudio`: A .mp3 file with the pronunciation of the MEP's name.
- `gender_Wiki`: A string specifying the gender of the MEP as mentioned on Wikipedia.
- `gender_2017`: A string indicating the gender of the MEP according to Europarl-2017 (<https://aclanthology.org/E17-1101.pdf>).
- `gender`: A string representing the MEP's gender after cross-referencing information from Wikipedia, europal-2017, and manual verification.
- `dateOfBirth_Wiki`: A string stating the date of birth of the MEP as mentioned on Wikipedia.
- `dateOfBirth_Home`: A string indicating the date of birth of the MEP as found on their homepage in the EU.
- `dateOfBirth`: A string representing the date of birth of the MEP after combining information from Wikipedia, their homepage, and manual verification.
- `placeOfBirth`: A string indicating the place of birth of the MEP as mentioned on their homepage.
- `country`: A string representing the nationality country of the MEP as mentioned on their homepage.
- `homePage`: A string providing the link to the MEP's homepage.
### Data Source
The Multi-EuP dataset was collected from the European Parliament (<https://www.europarl.europa.eu/portal/en>).
#### Initial Data Collection and Normalization
The code for the EMNLP MRL version is made publicly available by Jinrui Yang, Timothy Baldwin and Trevor Cohn of The University of Melbourne at <https://github.com/jrnlp/Multi-EuP>. This research was funded by a Melbourne Research Scholarship and undertaken using the LIEF HPC-GPGPU Facility hosted at the University of Melbourne. This facility was established with the assistance of LIEF Grant LE170100200.
### Ethics Statement
The dataset contains publicly-available EP data that does not include personal or sensitive information, with the exception of information relating to public officeholders, e.g., the names of the active members of the European Parliament, European Council, or other official administration bodies. The collected data is licensed under the Creative Commons Attribution 4.0 International licence. <https://eur-lex.europa.eu/content/legal-notice/legal-notice.html>
### Citation Information
```
@inproceedings{yang-etal-2023-multi-eup,
title = "Multi-{E}u{P}: The Multilingual {E}uropean Parliament Dataset for Analysis of Bias in Information Retrieval",
author = "Yang, Jinrui and
Baldwin, Timothy and
Cohn, Trevor",
editor = "Ataman, Duygu",
booktitle = "Proceedings of the 3rd Workshop on Multi-lingual Representation Learning (MRL)",
month = dec,
year = "2023",
address = "Singapore",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.mrl-1.21",
doi = "10.18653/v1/2023.mrl-1.21",
pages = "282--291",
}
```
|
wangxingjun778/test_dogs_and_cats | ---
license: apache-2.0
---
|
conceptofmind/r_stack_clean | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: max_stars_repo_path
dtype: string
- name: max_stars_repo_name
dtype: string
- name: max_stars_count
dtype: float64
- name: id
dtype: string
- name: content
dtype: string
- name: language
dtype: string
- name: word_number
dtype: int32
- name: compression_ratio
dtype: float32
- name: stop_word_ratio
dtype: float32
- name: flagged_words
dtype: float32
- name: char_repetition
dtype: float32
splits:
- name: train
num_bytes: 97691151
num_examples: 27913
download_size: 47229279
dataset_size: 97691151
---
# Dataset Card for "r_stack_clean"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kaleemWaheed/twitter_dataset_1713181311 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 29630
num_examples: 68
download_size: 15017
dataset_size: 29630
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MaralGPT/maralgpt-dataset-v0-1 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 50332815
num_examples: 35117
download_size: 22605931
dataset_size: 50332815
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: mit
---
# MaralGPT dataset v0.1
This is an Alpaca-style dataset, but the data now follows the chat format used by the _zephyr_ model.
konverner/fr-address | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 1399540
num_examples: 5500
download_size: 208333
dataset_size: 1399540
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "address_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BangumiBase/eromangasensei | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Eromanga-sensei
This is the image base of bangumi Eromanga-sensei, we detected 16 characters, 1936 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 732 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 39 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 34 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 11 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 33 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 302 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 51 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 15 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 81 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 166 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 34 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 257 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 30 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 7 | [Download](13/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 14 | 13 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 131 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
alisson40889/rochelle | ---
license: openrail
---
|