| datasetId | card |
|---|---|
GhostDragon01/Wider_FaceSegLite_Masks | ---
license: apache-2.0
dataset_info:
  features:
  - name: filepaths
    dtype: 'null'
  splits:
  - name: train
    num_bytes: 0
    num_examples: 0
  download_size: 547
  dataset_size: 0
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
|
CyberHarem/i_401_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of i_401/伊401/伊401 (Kantai Collection)
This is the dataset of i_401/伊401/伊401 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `ponytail, brown_hair, brown_eyes, short_hair, short_ponytail, hair_ornament, hairclip`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 413.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/i_401_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 275.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/i_401_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | Dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1081 | 554.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/i_401_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 383.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/i_401_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1081 | 725.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/i_401_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
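In the IMG+TXT packages, each image ships with a same-named `.txt` file holding its comma-separated tags. The sketch below shows one way to pair them up after extracting a package such as `dataset-800.zip` (the helper name is illustrative, not part of this dataset's tooling); the demo at the bottom runs on a synthetic directory with the same layout, so it works without downloading anything.

```python
import os
import tempfile

def load_image_tag_pairs(dataset_dir):
    """Pair each image in an extracted IMG+TXT package with its .txt tag file."""
    image_exts = {'.png', '.jpg', '.jpeg', '.webp'}
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        if ext.lower() in image_exts and os.path.exists(txt_path):
            with open(txt_path, encoding='utf-8') as f:
                pairs.append((os.path.join(dataset_dir, name), f.read().strip()))
    return pairs

# To fetch a real package (needs network and huggingface_hub), e.g. the 800px one:
#   from huggingface_hub import hf_hub_download
#   import zipfile
#   zip_file = hf_hub_download(repo_id='CyberHarem/i_401_kantaicollection',
#                              repo_type='dataset', filename='dataset-800.zip')
#   with zipfile.ZipFile(zip_file, 'r') as zf:
#       zf.extractall('dataset-800')
#   pairs = load_image_tag_pairs('dataset-800')

# Offline demo on a synthetic directory mimicking an extracted package.
with tempfile.TemporaryDirectory() as d:
    open(os.path.join(d, '001.png'), 'wb').close()
    with open(os.path.join(d, '001.txt'), 'w', encoding='utf-8') as f:
        f.write('1girl, solo, school_swimsuit')
    print(load_image_tag_pairs(d))
```

The same pairing works for any of the IMG+TXT packages in the table above, since they all follow the one-image-one-text layout.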
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/i_401_kantaicollection',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; recurring outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, blush, one-piece_swimsuit_pull, open_mouth, school_swimsuit, nipples, small_breasts, solo, tan, cum_on_breasts, facial, sailor_collar, shirt_lift, looking_at_viewer, school_uniform |
| 1 | 7 |  |  |  |  |  | 1girl, blush, looking_at_viewer, open_mouth, sailor_collar, school_swimsuit, smile, swimsuit_under_clothes, one-piece_swimsuit, tan, school_uniform, white_background, solo_focus |
| 2 | 7 |  |  |  |  |  | 1girl, one-piece_swimsuit, open_mouth, sailor_collar, school_swimsuit, swimsuit_under_clothes, tan, solo, :d, barefoot, looking_at_viewer, chibi, full_body |
| 3 | 47 |  |  |  |  |  | 1girl, orange_sailor_collar, swimsuit_under_clothes, school_swimsuit, sleeveless_shirt, solo, tan, looking_at_viewer, simple_background, white_background, blue_one-piece_swimsuit, white_shirt, smile, bangs, cowboy_shot, open_mouth |
| 4 | 6 |  |  |  |  |  | 1girl, day, sailor_collar, school_swimsuit, sky, solo, cloud, looking_at_viewer, one-piece_swimsuit, smile, swimsuit_under_clothes, school_uniform |
| 5 | 8 |  |  |  |  |  | 1girl, solo, alternate_costume, smile, looking_at_viewer, obi, open_mouth, simple_background, white_background, blue_kimono, blush, red_kimono, tan, wide_sleeves, floral_print |
| 6 | 5 |  |  |  |  |  | 1girl, pleated_skirt, sailor_collar, serafuku, solo, kneehighs, loafers, alternate_costume, black_socks, blue_skirt, brown_footwear, day, full_body, looking_at_viewer, neckerchief, outdoors, short_sleeves, sky, standing, arms_behind_back, building, character_name, facing_away, from_behind, holding, mountain, own_hands_together, plant, sidelocks, white_shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | one-piece_swimsuit_pull | open_mouth | school_swimsuit | nipples | small_breasts | solo | tan | cum_on_breasts | facial | sailor_collar | shirt_lift | looking_at_viewer | school_uniform | smile | swimsuit_under_clothes | one-piece_swimsuit | white_background | solo_focus | :d | barefoot | chibi | full_body | orange_sailor_collar | sleeveless_shirt | simple_background | blue_one-piece_swimsuit | white_shirt | bangs | cowboy_shot | day | sky | cloud | alternate_costume | obi | blue_kimono | red_kimono | wide_sleeves | floral_print | pleated_skirt | serafuku | kneehighs | loafers | black_socks | blue_skirt | brown_footwear | neckerchief | outdoors | short_sleeves | standing | arms_behind_back | building | character_name | facing_away | from_behind | holding | mountain | own_hands_together | plant | sidelocks |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------------|:-------------|:------------------|:----------|:----------------|:-------|:------|:-----------------|:---------|:----------------|:-------------|:--------------------|:-----------------|:--------|:-------------------------|:---------------------|:-------------------|:-------------|:-----|:-----------|:--------|:------------|:-----------------------|:-------------------|:--------------------|:--------------------------|:--------------|:--------|:--------------|:------|:------|:--------|:--------------------|:------|:--------------|:-------------|:---------------|:---------------|:----------------|:-----------|:------------|:----------|:--------------|:-------------|:-----------------|:--------------|:-----------|:----------------|:-----------|:-------------------|:-----------|:-----------------|:--------------|:--------------|:----------|:-----------|:---------------------|:--------|:------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | | X | X | | | | X | | | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | | X | X | | | X | X | | | X | | X | | | X | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 47 |  |  |  |  |  | X | | | X | X | | | X | X | | | | | X | | X | X | | X | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | | | | X | | | X | | | | X | | X | X | X | X | X | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | X | | X | | | | X | X | | | | | X | | X | | | X | | | | | | | | X | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | | | | | | X | | | | X | | X | | | | | | | | | | X | | | | | X | | | X | X | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
adsfsdf777/raw_datas | ---
license: mit
---
|
Madiator2011/lyoko-ultimate | ---
dataset_info:
  features:
  - name: image
    dtype: image
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 24808769.89
    num_examples: 1435
  download_size: 24242906
  dataset_size: 24808769.89
---
# Dataset Card for "lyoko-ultimate"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Test157t__Prima-Pastacles-7b | ---
pretty_name: Evaluation run of Test157t/Prima-Pastacles-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Test157t/Prima-Pastacles-7b](https://huggingface.co/Test157t/Prima-Pastacles-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Test157t__Prima-Pastacles-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-23T08:44:03.837845](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__Prima-Pastacles-7b/blob/main/results_2024-02-23T08-44-03.837845.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6432910587215106,\n\
\ \"acc_stderr\": 0.0322986612554744,\n \"acc_norm\": 0.6460930820826905,\n\
\ \"acc_norm_stderr\": 0.03294327903196253,\n \"mc1\": 0.3990208078335373,\n\
\ \"mc1_stderr\": 0.017142825728496763,\n \"mc2\": 0.566925031206497,\n\
\ \"mc2_stderr\": 0.015309864055288426\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6203071672354948,\n \"acc_stderr\": 0.014182119866974872,\n\
\ \"acc_norm\": 0.6604095563139932,\n \"acc_norm_stderr\": 0.01383903976282017\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6728739294961164,\n\
\ \"acc_stderr\": 0.0046820489066223174,\n \"acc_norm\": 0.8582951603266281,\n\
\ \"acc_norm_stderr\": 0.003480344142139515\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.038607315993160904,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.038607315993160904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n\
\ \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n\
\ \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812142,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812142\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.02407869658063548,\n \
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.02407869658063548\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131137,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131137\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8293577981651377,\n \"acc_stderr\": 0.016129271025099857,\n \"\
acc_norm\": 0.8293577981651377,\n \"acc_norm_stderr\": 0.016129271025099857\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967407,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967407\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676166,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676166\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728743,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728743\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973147,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973147\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37988826815642457,\n\
\ \"acc_stderr\": 0.0162328268186785,\n \"acc_norm\": 0.37988826815642457,\n\
\ \"acc_norm_stderr\": 0.0162328268186785\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958147,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958147\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n\
\ \"acc_stderr\": 0.01274085387294983,\n \"acc_norm\": 0.4661016949152542,\n\
\ \"acc_norm_stderr\": 0.01274085387294983\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.027979823538744546,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.027979823538744546\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.02484575321230604,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.02484575321230604\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3990208078335373,\n\
\ \"mc1_stderr\": 0.017142825728496763,\n \"mc2\": 0.566925031206497,\n\
\ \"mc2_stderr\": 0.015309864055288426\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7963693764798737,\n \"acc_stderr\": 0.011317798781626918\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5504169825625473,\n \
\ \"acc_stderr\": 0.013702290047884742\n }\n}\n```"
repo_url: https://huggingface.co/Test157t/Prima-Pastacles-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|arc:challenge|25_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|gsm8k|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hellaswag|10_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T08-44-03.837845.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-23T08-44-03.837845.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- '**/details_harness|winogrande|5_2024-02-23T08-44-03.837845.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-23T08-44-03.837845.parquet'
- config_name: results
data_files:
- split: 2024_02_23T08_44_03.837845
path:
- results_2024-02-23T08-44-03.837845.parquet
- split: latest
path:
- results_2024-02-23T08-44-03.837845.parquet
---
# Dataset Card for Evaluation run of Test157t/Prima-Pastacles-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Test157t/Prima-Pastacles-7b](https://huggingface.co/Test157t/Prima-Pastacles-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Test157t__Prima-Pastacles-7b",
	"harness_winogrande_5",
	split="latest")
```
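As a minimal sketch of the split-naming convention described above (assuming you already have a configuration's split names; `newest_run_split` is a hypothetical helper, not part of the `datasets` API), you can resolve the most recent timestamped run yourself, since "latest" is only an alias for it:

```python
def newest_run_split(split_names):
    """Return the most recent timestamped run split.

    Splits are named like "2024_02_23T08_44_03.837845"; because the
    format is fixed-width and ordered year-first, lexicographic max
    picks the newest run. The "latest" alias is excluded.
    """
    runs = [s for s in split_names if s != "latest"]
    return max(runs)


# Example with the split names defined in this card's configurations:
print(newest_run_split(["2024_02_23T08_44_03.837845", "latest"]))
# → 2024_02_23T08_44_03.837845
```

If this dataset later accumulates more evaluation runs, each new run adds another timestamped split alongside "latest" in every configuration.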
## Latest results

These are the [latest results from run 2024-02-23T08:44:03.837845](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__Prima-Pastacles-7b/blob/main/results_2024-02-23T08-44-03.837845.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6432910587215106,
"acc_stderr": 0.0322986612554744,
"acc_norm": 0.6460930820826905,
"acc_norm_stderr": 0.03294327903196253,
"mc1": 0.3990208078335373,
"mc1_stderr": 0.017142825728496763,
"mc2": 0.566925031206497,
"mc2_stderr": 0.015309864055288426
},
"harness|arc:challenge|25": {
"acc": 0.6203071672354948,
"acc_stderr": 0.014182119866974872,
"acc_norm": 0.6604095563139932,
"acc_norm_stderr": 0.01383903976282017
},
"harness|hellaswag|10": {
"acc": 0.6728739294961164,
"acc_stderr": 0.0046820489066223174,
"acc_norm": 0.8582951603266281,
"acc_norm_stderr": 0.003480344142139515
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.038607315993160904,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.038607315993160904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812142,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812142
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.02407869658063548,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.02407869658063548
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131137,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131137
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03038835355188679,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03038835355188679
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.016129271025099857,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.016129271025099857
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967407,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967407
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676166,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676166
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728743,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728743
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971726,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971726
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973147,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973147
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37988826815642457,
"acc_stderr": 0.0162328268186785,
"acc_norm": 0.37988826815642457,
"acc_norm_stderr": 0.0162328268186785
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958147,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958147
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4661016949152542,
"acc_stderr": 0.01274085387294983,
"acc_norm": 0.4661016949152542,
"acc_norm_stderr": 0.01274085387294983
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.027979823538744546,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.027979823538744546
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.02484575321230604,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.02484575321230604
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3990208078335373,
"mc1_stderr": 0.017142825728496763,
"mc2": 0.566925031206497,
"mc2_stderr": 0.015309864055288426
},
"harness|winogrande|5": {
"acc": 0.7963693764798737,
"acc_stderr": 0.011317798781626918
},
"harness|gsm8k|5": {
"acc": 0.5504169825625473,
"acc_stderr": 0.013702290047884742
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_PulsarAI__MetaMath-Tulpar-7b-v2-Slerp | ---
pretty_name: Evaluation run of PulsarAI/MetaMath-Tulpar-7b-v2-Slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PulsarAI/MetaMath-Tulpar-7b-v2-Slerp](https://huggingface.co/PulsarAI/MetaMath-Tulpar-7b-v2-Slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PulsarAI__MetaMath-Tulpar-7b-v2-Slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-09T17:55:14.434225](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__MetaMath-Tulpar-7b-v2-Slerp/blob/main/results_2023-12-09T17-55-14.434225.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.639251601749628,\n\
\ \"acc_stderr\": 0.03221647012444142,\n \"acc_norm\": 0.6389576323016398,\n\
\ \"acc_norm_stderr\": 0.03288102806405326,\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.564970662967412,\n\
\ \"mc2_stderr\": 0.015518503176886996\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6313993174061433,\n \"acc_stderr\": 0.014097810678042194,\n\
\ \"acc_norm\": 0.6561433447098977,\n \"acc_norm_stderr\": 0.013880644570156213\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6677952599083847,\n\
\ \"acc_stderr\": 0.004700413824942566,\n \"acc_norm\": 0.8516231826329417,\n\
\ \"acc_norm_stderr\": 0.0035474663103253973\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356852,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812143\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6384615384615384,\n \"acc_stderr\": 0.024359581465396997,\n\
\ \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465396997\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113115,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113115\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886786,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886786\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n\
\ \"acc_stderr\": 0.01563002297009244,\n \"acc_norm\": 0.8422018348623853,\n\
\ \"acc_norm_stderr\": 0.01563002297009244\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n\
\ \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.03826076324884866,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.03826076324884866\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41787709497206704,\n\
\ \"acc_stderr\": 0.016495400635820084,\n \"acc_norm\": 0.41787709497206704,\n\
\ \"acc_norm_stderr\": 0.016495400635820084\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.02573885479781874,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.02573885479781874\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4602346805736636,\n\
\ \"acc_stderr\": 0.012729785386598559,\n \"acc_norm\": 0.4602346805736636,\n\
\ \"acc_norm_stderr\": 0.012729785386598559\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.02895975519682487,\n\
\ \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.02895975519682487\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6617647058823529,\n \"acc_stderr\": 0.019139943748487043,\n \
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.019139943748487043\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.564970662967412,\n\
\ \"mc2_stderr\": 0.015518503176886996\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7947908445146015,\n \"acc_stderr\": 0.011350315707462063\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.709628506444276,\n \
\ \"acc_stderr\": 0.012503592481818948\n }\n}\n```"
repo_url: https://huggingface.co/PulsarAI/MetaMath-Tulpar-7b-v2-Slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|arc:challenge|25_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|gsm8k|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hellaswag|10_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T17-55-14.434225.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T17-55-14.434225.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- '**/details_harness|winogrande|5_2023-12-09T17-55-14.434225.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-09T17-55-14.434225.parquet'
- config_name: results
data_files:
- split: 2023_12_09T17_55_14.434225
path:
- results_2023-12-09T17-55-14.434225.parquet
- split: latest
path:
- results_2023-12-09T17-55-14.434225.parquet
---
# Dataset Card for Evaluation run of PulsarAI/MetaMath-Tulpar-7b-v2-Slerp
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PulsarAI/MetaMath-Tulpar-7b-v2-Slerp
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PulsarAI/MetaMath-Tulpar-7b-v2-Slerp](https://huggingface.co/PulsarAI/MetaMath-Tulpar-7b-v2-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PulsarAI__MetaMath-Tulpar-7b-v2-Slerp",
"harness_winogrande_5",
	split="latest")
```
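Each per-task config name in the YAML header is derived from the corresponding results key by replacing the `|`, `-`, and `:` separators with underscores (e.g. `harness|hendrycksTest-anatomy|5` becomes `harness_hendrycksTest_anatomy_5`). A minimal sketch of that mapping, handy for loading a task's details programmatically; the helper name is illustrative, not part of any published API:

```python
def task_to_config(task_key: str) -> str:
    """Map a results key such as 'harness|hendrycksTest-anatomy|5'
    to its dataset config name, e.g. 'harness_hendrycksTest_anatomy_5'.

    The mapping simply replaces the separator characters used in the
    results keys with underscores, matching the config names above.
    """
    for sep in ("|", "-", ":"):
        task_key = task_key.replace(sep, "_")
    return task_key

# Example: build the config name for the TruthfulQA MC task,
# then pass it as the second argument to load_dataset as shown above.
config_name = task_to_config("harness|truthfulqa:mc|0")
```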
## Latest results
These are the [latest results from run 2023-12-09T17:55:14.434225](https://huggingface.co/datasets/open-llm-leaderboard/details_PulsarAI__MetaMath-Tulpar-7b-v2-Slerp/blob/main/results_2023-12-09T17-55-14.434225.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split of each eval):
```json
{
"all": {
"acc": 0.639251601749628,
"acc_stderr": 0.03221647012444142,
"acc_norm": 0.6389576323016398,
"acc_norm_stderr": 0.03288102806405326,
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.564970662967412,
"mc2_stderr": 0.015518503176886996
},
"harness|arc:challenge|25": {
"acc": 0.6313993174061433,
"acc_stderr": 0.014097810678042194,
"acc_norm": 0.6561433447098977,
"acc_norm_stderr": 0.013880644570156213
},
"harness|hellaswag|10": {
"acc": 0.6677952599083847,
"acc_stderr": 0.004700413824942566,
"acc_norm": 0.8516231826329417,
"acc_norm_stderr": 0.0035474663103253973
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356852,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6384615384615384,
"acc_stderr": 0.024359581465396997,
"acc_norm": 0.6384615384615384,
"acc_norm_stderr": 0.024359581465396997
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113115,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113115
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886786,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03826076324884866,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03826076324884866
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41787709497206704,
"acc_stderr": 0.016495400635820084,
"acc_norm": 0.41787709497206704,
"acc_norm_stderr": 0.016495400635820084
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.02573885479781874,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.02573885479781874
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4602346805736636,
"acc_stderr": 0.012729785386598559,
"acc_norm": 0.4602346805736636,
"acc_norm_stderr": 0.012729785386598559
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.02895975519682487,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.02895975519682487
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.019139943748487043,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.019139943748487043
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.564970662967412,
"mc2_stderr": 0.015518503176886996
},
"harness|winogrande|5": {
"acc": 0.7947908445146015,
"acc_stderr": 0.011350315707462063
},
"harness|gsm8k|5": {
"acc": 0.709628506444276,
"acc_stderr": 0.012503592481818948
}
}
```
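Aggregate scores such as the leaderboard's MMLU average are derived from per-subtask entries like those in the results JSON above. As a minimal illustrative sketch (the two entries below are copied from the block above; the key-prefix convention `harness|hendrycksTest-` is taken from the listed task names):

```python
# Sketch: recompute an MMLU-style average from a results dict shaped like the
# JSON block above. Only two subtasks are included here for brevity.
results = {
    "harness|hendrycksTest-virology|5": {"acc": 0.5180722891566265},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8070175438596491},
}

# MMLU subtasks are the keys with the "hendrycksTest" prefix.
mmlu_accs = [
    v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU average over {len(mmlu_accs)} subtasks: {mmlu_avg:.4f}")
# → MMLU average over 2 subtasks: 0.6625
```

The full leaderboard average uses all 57 `hendrycksTest` subtasks in the same way.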
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_chargoddard__internlm2-base-7b-llama | ---
pretty_name: Evaluation run of chargoddard/internlm2-base-7b-llama
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chargoddard/internlm2-base-7b-llama](https://huggingface.co/chargoddard/internlm2-base-7b-llama)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__internlm2-base-7b-llama\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-21T22:11:28.111983](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__internlm2-base-7b-llama/blob/main/results_2024-01-21T22-11-28.111983.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5380339332515804,\n\
\ \"acc_stderr\": 0.03359422386201474,\n \"acc_norm\": 0.5448214536703925,\n\
\ \"acc_norm_stderr\": 0.03431835769902873,\n \"mc1\": 0.26805385556915545,\n\
\ \"mc1_stderr\": 0.015506204722834569,\n \"mc2\": 0.43232098792021034,\n\
\ \"mc2_stderr\": 0.014402330839994766\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5170648464163823,\n \"acc_stderr\": 0.014602878388536593,\n\
\ \"acc_norm\": 0.5435153583617748,\n \"acc_norm_stderr\": 0.01455594976049644\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.59061939852619,\n \
\ \"acc_stderr\": 0.004907146229347549,\n \"acc_norm\": 0.7946624178450508,\n\
\ \"acc_norm_stderr\": 0.004031225342516808\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.506578947368421,\n \"acc_stderr\": 0.040685900502249704,\n\
\ \"acc_norm\": 0.506578947368421,\n \"acc_norm_stderr\": 0.040685900502249704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5735849056603773,\n \"acc_stderr\": 0.03043779434298305,\n\
\ \"acc_norm\": 0.5735849056603773,\n \"acc_norm_stderr\": 0.03043779434298305\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n\
\ \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.5664739884393064,\n\
\ \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196156,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196156\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.451063829787234,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.451063829787234,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.04514496132873633,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.04514496132873633\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.023919984164047732,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.023919984164047732\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377563,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377563\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6225806451612903,\n\
\ \"acc_stderr\": 0.027575960723278233,\n \"acc_norm\": 0.6225806451612903,\n\
\ \"acc_norm_stderr\": 0.027575960723278233\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3399014778325123,\n \"acc_stderr\": 0.0333276906841079,\n\
\ \"acc_norm\": 0.3399014778325123,\n \"acc_norm_stderr\": 0.0333276906841079\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6767676767676768,\n \"acc_stderr\": 0.033322999210706444,\n \"\
acc_norm\": 0.6767676767676768,\n \"acc_norm_stderr\": 0.033322999210706444\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7668393782383419,\n \"acc_stderr\": 0.03051611137147601,\n\
\ \"acc_norm\": 0.7668393782383419,\n \"acc_norm_stderr\": 0.03051611137147601\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5051282051282051,\n \"acc_stderr\": 0.025349672906838653,\n\
\ \"acc_norm\": 0.5051282051282051,\n \"acc_norm_stderr\": 0.025349672906838653\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23703703703703705,\n \"acc_stderr\": 0.025928876132766118,\n \
\ \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.025928876132766118\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.031566630992154156,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.031566630992154156\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2185430463576159,\n \"acc_stderr\": 0.03374235550425694,\n \"\
acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.03374235550425694\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7504587155963303,\n \"acc_stderr\": 0.018553897629501628,\n \"\
acc_norm\": 0.7504587155963303,\n \"acc_norm_stderr\": 0.018553897629501628\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"\
acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923393,\n \"\
acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923393\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5964125560538116,\n\
\ \"acc_stderr\": 0.03292802819330314,\n \"acc_norm\": 0.5964125560538116,\n\
\ \"acc_norm_stderr\": 0.03292802819330314\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969637,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969637\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968432,\n \"\
acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968432\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6018518518518519,\n\
\ \"acc_stderr\": 0.04732332615978815,\n \"acc_norm\": 0.6018518518518519,\n\
\ \"acc_norm_stderr\": 0.04732332615978815\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6073619631901841,\n \"acc_stderr\": 0.03836740907831029,\n\
\ \"acc_norm\": 0.6073619631901841,\n \"acc_norm_stderr\": 0.03836740907831029\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264673,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264673\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\
\ \"acc_stderr\": 0.025598193686652258,\n \"acc_norm\": 0.811965811965812,\n\
\ \"acc_norm_stderr\": 0.025598193686652258\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7305236270753512,\n\
\ \"acc_stderr\": 0.015866243073215054,\n \"acc_norm\": 0.7305236270753512,\n\
\ \"acc_norm_stderr\": 0.015866243073215054\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5722543352601156,\n \"acc_stderr\": 0.026636539741116082,\n\
\ \"acc_norm\": 0.5722543352601156,\n \"acc_norm_stderr\": 0.026636539741116082\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5816993464052288,\n \"acc_stderr\": 0.02824513402438729,\n\
\ \"acc_norm\": 0.5816993464052288,\n \"acc_norm_stderr\": 0.02824513402438729\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.617363344051447,\n\
\ \"acc_stderr\": 0.027604689028581993,\n \"acc_norm\": 0.617363344051447,\n\
\ \"acc_norm_stderr\": 0.027604689028581993\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6450617283950617,\n \"acc_stderr\": 0.026624152478845853,\n\
\ \"acc_norm\": 0.6450617283950617,\n \"acc_norm_stderr\": 0.026624152478845853\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.34397163120567376,\n \"acc_stderr\": 0.028338017428611324,\n \
\ \"acc_norm\": 0.34397163120567376,\n \"acc_norm_stderr\": 0.028338017428611324\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4152542372881356,\n\
\ \"acc_stderr\": 0.012585471793400662,\n \"acc_norm\": 0.4152542372881356,\n\
\ \"acc_norm_stderr\": 0.012585471793400662\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5845588235294118,\n \"acc_stderr\": 0.02993534270787774,\n\
\ \"acc_norm\": 0.5845588235294118,\n \"acc_norm_stderr\": 0.02993534270787774\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5751633986928104,\n \"acc_stderr\": 0.019997973035458333,\n \
\ \"acc_norm\": 0.5751633986928104,\n \"acc_norm_stderr\": 0.019997973035458333\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.030862144921087555,\n\
\ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.030862144921087555\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26805385556915545,\n\
\ \"mc1_stderr\": 0.015506204722834569,\n \"mc2\": 0.43232098792021034,\n\
\ \"mc2_stderr\": 0.014402330839994766\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.012696531870038611\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.19181197877179681,\n \
\ \"acc_stderr\": 0.010845169955294016\n }\n}\n```"
repo_url: https://huggingface.co/chargoddard/internlm2-base-7b-llama
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|arc:challenge|25_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|gsm8k|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hellaswag|10_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T22-11-28.111983.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T22-11-28.111983.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- '**/details_harness|winogrande|5_2024-01-21T22-11-28.111983.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-21T22-11-28.111983.parquet'
- config_name: results
data_files:
- split: 2024_01_21T22_11_28.111983
path:
- results_2024-01-21T22-11-28.111983.parquet
- split: latest
path:
- results_2024-01-21T22-11-28.111983.parquet
---
# Dataset Card for Evaluation run of chargoddard/internlm2-base-7b-llama
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [chargoddard/internlm2-base-7b-llama](https://huggingface.co/chargoddard/internlm2-base-7b-llama) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__internlm2-base-7b-llama",
"harness_winogrande_5",
	split="latest")
```
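Each per-task configuration name follows the pattern visible in this card's YAML header. As a small sketch (the helper name `mmlu_config_name` is ours, not part of any library), the names can be built programmatically before passing them to `load_dataset`:

```python
# Sketch: build per-task config names matching this dataset's YAML header.
# The pattern is inferred from the config list above; tasks shown are
# illustrative MMLU (hendrycksTest) subject names.
def mmlu_config_name(task: str, n_shot: int = 5) -> str:
    """Return the harness config name for an MMLU (hendrycksTest) task."""
    return f"harness_hendrycksTest_{task}_{n_shot}"

tasks = ["college_biology", "high_school_statistics", "world_religions"]
configs = [mmlu_config_name(t) for t in tasks]
print(configs[0])  # harness_hendrycksTest_college_biology_5
```

Any of these names can then be used as the second argument to `load_dataset`, with `split="latest"` to get the most recent eval.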
## Latest results
These are the [latest results from run 2024-01-21T22:11:28.111983](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__internlm2-base-7b-llama/blob/main/results_2024-01-21T22-11-28.111983.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in its own configuration, with the "latest" split pointing to the most recent eval):
```python
{
"all": {
"acc": 0.5380339332515804,
"acc_stderr": 0.03359422386201474,
"acc_norm": 0.5448214536703925,
"acc_norm_stderr": 0.03431835769902873,
"mc1": 0.26805385556915545,
"mc1_stderr": 0.015506204722834569,
"mc2": 0.43232098792021034,
"mc2_stderr": 0.014402330839994766
},
"harness|arc:challenge|25": {
"acc": 0.5170648464163823,
"acc_stderr": 0.014602878388536593,
"acc_norm": 0.5435153583617748,
"acc_norm_stderr": 0.01455594976049644
},
"harness|hellaswag|10": {
"acc": 0.59061939852619,
"acc_stderr": 0.004907146229347549,
"acc_norm": 0.7946624178450508,
"acc_norm_stderr": 0.004031225342516808
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.506578947368421,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.506578947368421,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5735849056603773,
"acc_stderr": 0.03043779434298305,
"acc_norm": 0.5735849056603773,
"acc_norm_stderr": 0.03043779434298305
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.03778621079092055,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.03778621079092055
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196156,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196156
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.451063829787234,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.451063829787234,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.04514496132873633,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.04514496132873633
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.023919984164047732,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.023919984164047732
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377563,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377563
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6225806451612903,
"acc_stderr": 0.027575960723278233,
"acc_norm": 0.6225806451612903,
"acc_norm_stderr": 0.027575960723278233
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3399014778325123,
"acc_stderr": 0.0333276906841079,
"acc_norm": 0.3399014778325123,
"acc_norm_stderr": 0.0333276906841079
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6767676767676768,
"acc_stderr": 0.033322999210706444,
"acc_norm": 0.6767676767676768,
"acc_norm_stderr": 0.033322999210706444
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7668393782383419,
"acc_stderr": 0.03051611137147601,
"acc_norm": 0.7668393782383419,
"acc_norm_stderr": 0.03051611137147601
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5051282051282051,
"acc_stderr": 0.025349672906838653,
"acc_norm": 0.5051282051282051,
"acc_norm_stderr": 0.025349672906838653
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.025928876132766118,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.025928876132766118
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.031566630992154156,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.031566630992154156
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.03374235550425694,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.03374235550425694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7504587155963303,
"acc_stderr": 0.018553897629501628,
"acc_norm": 0.7504587155963303,
"acc_norm_stderr": 0.018553897629501628
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.030964517926923393,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.030964517926923393
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5964125560538116,
"acc_stderr": 0.03292802819330314,
"acc_norm": 0.5964125560538116,
"acc_norm_stderr": 0.03292802819330314
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969637,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969637
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968432,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968432
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.04732332615978815,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.04732332615978815
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6073619631901841,
"acc_stderr": 0.03836740907831029,
"acc_norm": 0.6073619631901841,
"acc_norm_stderr": 0.03836740907831029
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.044532548363264673,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.044532548363264673
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.025598193686652258,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.025598193686652258
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7305236270753512,
"acc_stderr": 0.015866243073215054,
"acc_norm": 0.7305236270753512,
"acc_norm_stderr": 0.015866243073215054
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.026636539741116082,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.026636539741116082
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5816993464052288,
"acc_stderr": 0.02824513402438729,
"acc_norm": 0.5816993464052288,
"acc_norm_stderr": 0.02824513402438729
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.617363344051447,
"acc_stderr": 0.027604689028581993,
"acc_norm": 0.617363344051447,
"acc_norm_stderr": 0.027604689028581993
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6450617283950617,
"acc_stderr": 0.026624152478845853,
"acc_norm": 0.6450617283950617,
"acc_norm_stderr": 0.026624152478845853
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.34397163120567376,
"acc_stderr": 0.028338017428611324,
"acc_norm": 0.34397163120567376,
"acc_norm_stderr": 0.028338017428611324
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4152542372881356,
"acc_stderr": 0.012585471793400662,
"acc_norm": 0.4152542372881356,
"acc_norm_stderr": 0.012585471793400662
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5845588235294118,
"acc_stderr": 0.02993534270787774,
"acc_norm": 0.5845588235294118,
"acc_norm_stderr": 0.02993534270787774
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5751633986928104,
"acc_stderr": 0.019997973035458333,
"acc_norm": 0.5751633986928104,
"acc_norm_stderr": 0.019997973035458333
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.030862144921087555,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.030862144921087555
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26805385556915545,
"mc1_stderr": 0.015506204722834569,
"mc2": 0.43232098792021034,
"mc2_stderr": 0.014402330839994766
},
"harness|winogrande|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.012696531870038611
},
"harness|gsm8k|5": {
"acc": 0.19181197877179681,
"acc_stderr": 0.010845169955294016
}
}
```
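As a hedged illustration of how the per-task scores above relate to an aggregate, the snippet below macro-averages a hand-picked subset of three tasks (values copied from the JSON above). The real leaderboard averages all 57 hendrycksTest tasks, so the number here is illustrative only:

```python
# Sketch: macro-average accuracy over a small subset of the per-task
# results shown above. Only three tasks are included for brevity, so
# this figure will differ from the leaderboard's full MMLU average.
results = {
    "harness|hendrycksTest-college_biology|5": 0.6041666666666666,
    "harness|hendrycksTest-college_chemistry|5": 0.39,
    "harness|hendrycksTest-computer_security|5": 0.68,
}
macro_avg = sum(results.values()) / len(results)
print(round(macro_avg, 4))  # 0.5581
```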
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Aggshourya/anime_test2_test | ---
license: openrail
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 263525.0
num_examples: 8
download_size: 263846
dataset_size: 263525.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
abrahamzelano/uwu | ---
license: openrail
---
|
Nazzaroth2/embedding_100_test | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: lang
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 8823
num_examples: 200
download_size: 5829
dataset_size: 8823
---
# Dataset Card for "embedding_100_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ytzi/starcoderdata-gpt2 | ---
dataset_info:
- config_name: julia
features:
- name: content
dtype: string
- name: input_ids
sequence: int32
- name: ratio_char_token
dtype: float64
- name: token_count
dtype: int64
splits:
- name: train
num_bytes: 3786710621
num_examples: 295364
download_size: 1199205457
dataset_size: 3786710621
- config_name: lua
features:
- name: content
dtype: string
- name: input_ids
sequence: int32
- name: ratio_char_token
dtype: float64
- name: token_count
dtype: int64
splits:
- name: train
num_bytes: 7979642116
num_examples: 549459
download_size: 2377951926
dataset_size: 7979642116
- config_name: python
features:
- name: content
dtype: string
- name: input_ids
sequence: int32
- name: ratio_char_token
dtype: float64
- name: token_count
dtype: int64
splits:
- name: train
num_bytes: 174067658922
num_examples: 12866649
download_size: 50581374044
dataset_size: 174067658922
- config_name: racket
features:
- name: content
dtype: string
- name: input_ids
sequence: int32
- name: ratio_char_token
dtype: float64
- name: token_count
dtype: int64
splits:
- name: train
num_bytes: 65123780
num_examples: 3688
download_size: 23524575
dataset_size: 65123780
- config_name: scheme
features:
- name: content
dtype: string
- name: input_ids
sequence: int32
- name: ratio_char_token
dtype: float64
- name: token_count
dtype: int64
splits:
- name: train
num_bytes: 585290772
num_examples: 41890
download_size: 158058450
dataset_size: 585290772
configs:
- config_name: julia
data_files:
- split: train
path: julia/train-*
- config_name: lua
data_files:
- split: train
path: lua/train-*
- config_name: python
data_files:
- split: train
path: python/train-*
- config_name: racket
data_files:
- split: train
path: racket/train-*
- config_name: scheme
data_files:
- split: train
path: scheme/train-*
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/4e46e0e8 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 186
num_examples: 10
download_size: 1338
dataset_size: 186
---
# Dataset Card for "4e46e0e8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DynamicSuperbPrivate/SpeakerCounting_LibrittsTrainClean100 | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: instruction
dtype: string
- name: label
dtype: string
- name: utterance 1
dtype: string
- name: utterance 2
dtype: string
- name: utterance 3
dtype: string
- name: utterance 4
dtype: string
- name: utterance 5
dtype: string
splits:
- name: train
num_bytes: 1438538131.0
num_examples: 10000
- name: validation
num_bytes: 199304545.0
num_examples: 1000
download_size: 2240435961
dataset_size: 1637842676.0
---
# Dataset Card for "SpeakerCounting_LibriTTSTrainClean100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yasirchemmakh/Moroccan_ads | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: ad
dtype: string
- name: title
dtype: string
- name: link
dtype: string
- name: channel
dtype: string
splits:
- name: train
num_bytes: 1115354
num_examples: 3992
download_size: 366806
dataset_size: 1115354
---
|
AdapterOcean/Open_Platypus_standardized_cluster_12_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1645484
num_examples: 4572
download_size: 729798
dataset_size: 1645484
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Open_Platypus_standardized_cluster_12_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Seanxh/twitter_dataset_1713209462 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 160048
num_examples: 375
download_size: 58421
dataset_size: 160048
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
azizyooni/VQA_Polysemy | ---
license: mit
dataset_info:
features:
- name: question
dtype: string
- name: image1
dtype: image
- name: image2
dtype: image
splits:
- name: train
num_bytes: 212639186.0
num_examples: 37
download_size: 212629154
dataset_size: 212639186.0
---
|
kuroneko5943/amz20 | ---
annotations_creators:
- found
language:
- en
language_creators:
- found
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: amz20
size_categories:
- 1K<n<10K
source_datasets:
- extended|amazon_us_reviews
tags:
- amazon
task_categories:
- text-classification
task_ids:
- sentiment-classification
--- |
GEM-submissions/lewtun__this-is-a-test-name__1655905032 | ---
benchmark: gem
type: prediction
submission_name: This is a test name
tags:
- evaluation
- benchmark
---
# GEM Submission
Submission name: This is a test name
|
jordanfan/processed_us_congress_117_bills_25_75_perentile | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: index
dtype: int64
- name: id
dtype: string
- name: policy_areas
dtype: string
- name: cur_summary
dtype: string
- name: cur_text
dtype: string
- name: title
dtype: string
- name: titles_official
dtype: string
- name: titles_short
dtype: string
- name: sponsor_name
dtype: string
- name: sponsor_party
dtype: string
- name: sponsor_state
dtype: string
- name: cleaned_summary
dtype: string
- name: extracted_text
dtype: string
- name: extracted_text_375
dtype: string
- name: extracted_text_750
dtype: string
- name: extracted_text_1000
dtype: string
- name: bertsum_extracted_250
dtype: string
- name: bertsum_extracted_375
dtype: string
- name: bertsum_extracted_375_1000
dtype: string
- name: bertsum_extracted_250_1000
dtype: string
- name: bertsum_extracted_375_750
dtype: string
- name: bertsum_extracted_250_750
dtype: string
- name: bertsum_extracted_375_500
dtype: string
- name: bertsum_extracted_250_500
dtype: string
- name: bertsum_extracted_375_375
dtype: string
- name: bertsum_extracted_250_375
dtype: string
- name: text_len
dtype: int64
splits:
- name: train
num_bytes: 306431904.16626763
num_examples: 5627
- name: val
num_bytes: 90766342.18742621
num_examples: 1713
- name: test
num_bytes: 13823238.76657825
num_examples: 185
download_size: 125750024
dataset_size: 411021485.1202721
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
ankhamun/c1 | ---
dataset_info:
features:
- name: prompts
dtype: string
- name: responses
dtype: string
splits:
- name: train
num_bytes: 6051897
num_examples: 2260
download_size: 3151600
dataset_size: 6051897
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
siumankwan23/file1k2 | ---
license: mit
---
|
CyberHarem/perusepone2shi_jashinchandropkick | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ペルセポネ2世
This is the dataset of ペルセポネ2世, containing 144 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 144 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 330 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 144 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 144 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 144 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 144 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 144 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 330 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 330 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 330 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
ibranze/araproje_hellaswag_en_s2 | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 149738.0
num_examples: 250
download_size: 82878
dataset_size: 149738.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_en_s2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bigbio/hallmarks_of_cancer |
---
language:
- en
bigbio_language:
- English
license: gpl-3.0
multilinguality: monolingual
bigbio_license_shortname: GPL_3p0
pretty_name: Hallmarks of Cancer
homepage: https://github.com/sb895/Hallmarks-of-Cancer
bigbio_pubmed: True
bigbio_public: True
bigbio_tasks:
- TEXT_CLASSIFICATION
---
# Dataset Card for Hallmarks of Cancer
## Dataset Description
- **Homepage:** https://github.com/sb895/Hallmarks-of-Cancer
- **Pubmed:** True
- **Public:** True
- **Tasks:** TXTCLASS
The Hallmarks of Cancer (HOC) Corpus consists of 1852 PubMed publication
abstracts manually annotated by experts according to a taxonomy. The taxonomy
consists of 37 classes in a hierarchy. Zero or more class labels are assigned
to each sentence in the corpus. The labels are found under the "labels"
directory, while the tokenized text can be found under "text" directory.
The filenames are the corresponding PubMed IDs (PMID).
## Citation Information
```
@article{DBLP:journals/bioinformatics/BakerSGAHSK16,
author = {Simon Baker and
Ilona Silins and
Yufan Guo and
Imran Ali and
Johan H{\"{o}}gberg and
Ulla Stenius and
Anna Korhonen},
title = {Automatic semantic classification of scientific literature
according to the hallmarks of cancer},
journal = {Bioinform.},
volume = {32},
number = {3},
pages = {432--440},
year = {2016},
url = {https://doi.org/10.1093/bioinformatics/btv585},
doi = {10.1093/bioinformatics/btv585},
timestamp = {Thu, 14 Oct 2021 08:57:44 +0200},
biburl = {https://dblp.org/rec/journals/bioinformatics/BakerSGAHSK16.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
|
tyzhu/ds_combined_try_lora_merge | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 2088.495238095238
num_examples: 20
- name: validation
num_bytes: 2088.495238095238
num_examples: 20
download_size: 5988
dataset_size: 4176.990476190476
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "ds_combined_try_lora_merge"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
adityarra07/test_ds_uwb_atc | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 133378491.71317436
num_examples: 1000
download_size: 128146292
dataset_size: 133378491.71317436
---
# Dataset Card for "test_ds_uwb_atc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-from-one-sec-cv12/chunk_54 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1332836852
num_examples: 259711
download_size: 1362554244
dataset_size: 1332836852
---
# Dataset Card for "chunk_54"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
selimyagci/generatedMisogyny | ---
license: unknown
---
|
liuyanchen1015/MULTI_VALUE_sst2_present_perfect_ever | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 3536
num_examples: 25
- name: test
num_bytes: 9243
num_examples: 59
- name: train
num_bytes: 137625
num_examples: 1071
download_size: 75239
dataset_size: 150404
---
# Dataset Card for "MULTI_VALUE_sst2_present_perfect_ever"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Braddy/xview_captions_gt | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
sequence: string
- name: file_id
dtype: string
splits:
- name: train
num_bytes: 5117788.0
num_examples: 47
download_size: 5117884
dataset_size: 5117788.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "xview_captions_gt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jan-hq/hh_rlhf_reversed_binarized | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 312180198
num_examples: 160800
- name: test
num_bytes: 16755445
num_examples: 8552
download_size: 181796170
dataset_size: 328935643
---
# Dataset Card for "hh_rlhf_reversed_binarized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibivibiv/alpaca_tiny8 | ---
dataset_info:
features:
- name: output
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 460187240
num_examples: 290901
download_size: 266363303
dataset_size: 460187240
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
chucklechamp26/peter-griffin | ---
license: mit
---
|
umarigan/recipe_dataset_tokenized | ---
dataset_info:
features:
- name: title
dtype: string
- name: ingredients
dtype: string
- name: directions
dtype: string
- name: image
dtype: image
- name: embeddings_image
sequence: float64
- name: embeddings_text
sequence: float64
splits:
- name: train
num_bytes: 3703247138.125
num_examples: 82303
download_size: 3678809691
dataset_size: 3703247138.125
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
aaditya/gptdetect | ---
dataset_info:
features:
- name: intro
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 4092179.2
num_examples: 4000
- name: validation
num_bytes: 511522.4
num_examples: 500
- name: test
num_bytes: 511522.4
num_examples: 500
download_size: 3165913
dataset_size: 5115224.000000001
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Lichang-Chen/800k_ift | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: prompt
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 1446099784
num_examples: 795213
- name: test
num_bytes: 76233024
num_examples: 41854
download_size: 818905443
dataset_size: 1522332808
---
# Dataset Card for "800k_ift"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
iamnguyen/ds_by_sys_prompt_0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 792945192.7634923
num_examples: 464912
download_size: 449699774
dataset_size: 792945192.7634923
---
# Dataset Card for "ds_by_sys_prompt_0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yangp/chat-gvg-rings | ---
dataset_info:
features:
- name: context
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 59359950
num_examples: 7858
download_size: 20556706
dataset_size: 59359950
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "chat-gvg-rings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
michelcarroll/llama2-earnings-stock-prediction-fine-tune | ---
license: apache-2.0
dataset_info:
features:
- name: completion
dtype: string
splits:
- name: train
num_bytes: 8716152
num_examples: 10000
- name: development
num_bytes: 1754138
num_examples: 1991
- name: test
num_bytes: 1718461
num_examples: 1949
download_size: 4555053
dataset_size: 12188751
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: development
path: data/development-*
- split: test
path: data/test-*
---
|
ekazuki/french_deputies_tweet_sentiment | ---
dataset_info:
features:
- name: twitterId
dtype: string
- name: text
dtype: string
- name: hasMedia
dtype: bool
- name: date
dtype: timestamp[ns]
- name: authorId
dtype: string
- name: group
dtype: string
- name: subjects
sequence: string
splits:
- name: train
num_bytes: 750203.9423641703
num_examples: 2179
- name: test
num_bytes: 187637.05763582967
num_examples: 545
download_size: 617869
dataset_size: 937841.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
PORTULAN/glue-ptpt | ---
language:
- pt
language_creators:
- machine-generated
source_datasets:
- glue
pretty_name: GLUE-PTPT -- The General Language Understanding Evaluation benchmark translated to European Portuguese
size_categories:
- 10K<n<100K
---
# GLUE-PTPT -- The General Language Understanding Evaluation benchmark translated to European Portuguese
This dataset has been created to evaluate [Albertina PT-* models](https://huggingface.co/PORTULAN/albertina-ptpt).
If you use this dataset, please cite:
```
@misc{rodrigues2023advancing,
      title={Advancing Neural Encoding of Portuguese with Transformer Albertina PT-*},
      author={João Rodrigues and Luís Gomes and João Silva and António Branco and Rodrigo Santos and Henrique Lopes Cardoso and Tomás Osório},
      year={2023},
      eprint={2305.06721},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
Thus far, only 4 tasks have been translated to European Portuguese:
- MRPC
- RTE
- STS-B
- WNLI
The remaining tasks will be added in the future.
See [gluebenchmark.com](https://gluebenchmark.com/) for information about the General Language Understanding Evaluation (GLUE) dataset.
|
engr-farhan/prompts | ---
license: apache-2.0
---
|
BleachNick/MIC_sampled | ---
license: other
---
|
fathyshalaby/dds | ---
dataset_info:
features:
- name: user-message
dtype: string
id: field
- name: question-rating
list:
- name: user_id
dtype: string
id: question
- name: value
dtype: int32
id: suggestion
- name: status
dtype: string
id: question
- name: question-rating-suggestion
dtype: int32
id: suggestion
- name: question-rating-suggestion-metadata
struct:
- name: type
dtype: string
id: suggestion-metadata
- name: score
dtype: float32
id: suggestion-metadata
- name: agent
dtype: string
id: suggestion-metadata
- name: response
list:
- name: user_id
dtype: string
id: question
- name: value
dtype: string
id: suggestion
- name: status
dtype: string
id: question
- name: response-suggestion
dtype: string
id: suggestion
- name: response-suggestion-metadata
struct:
- name: type
dtype: string
id: suggestion-metadata
- name: score
dtype: float32
id: suggestion-metadata
- name: agent
dtype: string
id: suggestion-metadata
- name: external_id
dtype: string
id: external_id
- name: metadata
dtype: string
id: metadata
splits:
- name: train
num_bytes: 102128
num_examples: 38
download_size: 88081
dataset_size: 102128
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AhmedSSoliman/CodeSearchNet | ---
license: ms-pl
---
|
nlplabtdtu/data-synthetic-part-2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7762711551
num_examples: 466816
download_size: 3654128693
dataset_size: 7762711551
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
seanghay/khmer_grkpp_speech | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 1089950686
num_examples: 533
download_size: 1087070422
dataset_size: 1089950686
language:
- km
pretty_name: Khmer Speech of Phnom Penh Gendarmerie
---
I do not own the dataset. It was force-aligned and published for research purposes only because Khmer is a low-resource language.
Total: 4.446 hours |
brandon12333/otis.dataset | ---
license: apache-2.0
---
|
yashm/Phrases_2 | ---
license: mit
---
|
atwine/bmg | ---
license: mit
---
|
Ivaldonetto/gregvoz | ---
license: lgpl
---
|
wilderdata/gsgw-posts | ---
tags:
- not-for-all-audiences
---
This is a historical scrape of posts from the r/gaystoriesgonewild subreddit before 2023, with no data cleaning whatsoever. All possible metadata has been retained. |
freddyaboulton/new_saving_json_7 | ---
configs:
- config_name: default
data_files:
- split: train
path: '**/*.jsonl'
dataset_info:
- name: Chatbot
dtype: string
- name: Image
dtype: string
- name: Image file
dtype: Image
- label: flag
dtype: string
- label: flag
dtype: string
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
emozilla/yarn-train-tokenized-8k-mistral | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 44399670436
num_examples: 416867
download_size: 12176377159
dataset_size: 44399670436
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "yarn-train-tokenized-8k-mistral"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
codkiller0911/kotlin_code | ---
language:
- en
tags:
- kotlin
- android
size_categories:
- 1K<n<10K
---
# Dataset Card for Dataset kotlin_code
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains Kotlin functions with their documentation. It can be useful for fine-tuning or training models that generate code documentation.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
approach0/PRM | ---
dataset_info:
features:
- name: src_path
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 10167869.0
num_examples: 7448
- name: test
num_bytes: 5304144.0
num_examples: 3864
download_size: 5681426
dataset_size: 15472013.0
---
# Dataset Card for "PRM"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Artificial69/sameer | ---
dataset_info:
features:
- name: conversations
dtype: 'null'
splits:
- name: train
num_bytes: 0
num_examples: 0
download_size: 563
dataset_size: 0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
adamjweintraut/eli5_lfqa_best | ---
dataset_info:
features:
- name: index
dtype: int64
- name: q_id
dtype: string
- name: question
dtype: string
- name: best_answer
dtype: string
- name: all_answers
sequence: string
- name: num_answers
dtype: int64
- name: top_answers
sequence: string
- name: num_top_answers
dtype: int64
- name: context
dtype: string
- name: orig
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 2557146936.7402635
num_examples: 183333
- name: test
num_bytes: 319648597.62986815
num_examples: 22917
- name: validation
num_bytes: 319648597.62986815
num_examples: 22917
download_size: 1932532942
dataset_size: 3196444131.9999995
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
sepidpy/wine-ratings | ---
license: mit
---
|
enoahjr/twitter_dataset_1713135208 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 209871
num_examples: 585
download_size: 77016
dataset_size: 209871
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
alvarobartt/mini-capybara-100 | ---
dataset_info:
features:
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 1048305.9896866323
num_examples: 100
download_size: 575394
dataset_size: 1048305.9896866323
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
khhuang/chartve_dataset | ---
language:
- en
license: apache-2.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
tags:
- chart
- plot
- chart-to-text
- vistext
- statista
- pew
- chart-visual-entailment
- chart-understanding
- chart-captioning
- chart-summarization
- document-image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
dataset_info:
features:
- name: image
dtype: string
- name: sentence
dtype: string
- name: label
dtype: string
- name: manipulation_type
dtype: string
- name: dataset
dtype: string
splits:
- name: train
num_bytes: 118229163.0
num_examples: 522531
- name: dev
num_bytes: 9400046.0
num_examples: 36002
download_size: 51634467
dataset_size: 127629209.0
---
# Dataset Card for ChartVE's Training Data
- [Dataset Description](https://huggingface.co/datasets/khhuang/ChartVE/blob/main/README.md#dataset-description)
- [Paper Information](https://huggingface.co/datasets/khhuang/ChartVE/blob/main/README.md#paper-information)
- [Citation](https://huggingface.co/datasets/khhuang/ChartVE/blob/main/README.md#citation)
## Dataset Description
[ChartVE](https://huggingface.co/khhuang/chartve) (Chart Visual Entailment) is a visual entailment model introduced in the paper "Do LVLMs Understand Charts? Analyzing and Correcting Factual Errors in Chart Captioning" for evaluating the factuality of a generated caption sentence with regard to the input chart. The model takes in a chart figure and a caption sentence as input, and outputs an entailment probability. This repository hosts the training and validation data for ChartVE.
### Fields
Below, we illustrate the fields in each instance.
- `image`: The path to the chart image. Images can be found in [images.zip](https://huggingface.co/datasets/khhuang/chartve_dataset/blob/main/images.zip).
- `sentence`: The sentence used as the _hypothesis_.
- `label`: An indicator of whether the chart entails the given `sentence`.
- `manipulation_type`: The type of perturbation that alters the original sentence (only applicable to non-entailment instances).
- `dataset`: The source dataset of the chart `image`.
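As a minimal sketch of how these fields might be consumed (the record values below, including the label strings, are invented for illustration; only the field names come from the list above), one can filter hypothesis sentences by entailment label:

```python
# Illustrative records mirroring the fields described above.
# All values here are made-up examples, not real dataset rows.
records = [
    {"image": "images/statista_0001.png",
     "sentence": "Revenue increased between 2019 and 2020.",
     "label": "entailment",
     "manipulation_type": "",
     "dataset": "statista"},
    {"image": "images/pew_0042.png",
     "sentence": "Revenue decreased between 2019 and 2020.",
     "label": "non-entailment",
     "manipulation_type": "value",
     "dataset": "pew"},
]

# Keep only the hypothesis sentences that the chart entails.
entailed = [r["sentence"] for r in records if r["label"] == "entailment"]
print(entailed)
```

The same filter works unchanged on a split loaded via `datasets.load_dataset`, since each example is a plain dict with these keys.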
## Paper Information
- Paper: https://arxiv.org/abs/2312.10160
- Code: https://github.com/khuangaf/CHOCOLATE/
- Project: https://khuangaf.github.io/CHOCOLATE
## Citation
If you use the **ChartVE** dataset/model in your work, please kindly cite the paper using this BibTeX:
```
@misc{huang-etal-2023-do,
title = "Do LVLMs Understand Charts? Analyzing and Correcting Factual Errors in Chart Captioning",
author = "Huang, Kung-Hsiang and
Zhou, Mingyang and
Chan, Hou Pong and
Fung, Yi R. and
Wang, Zhenhailong and
Zhang, Lingyu and
Chang, Shih-Fu and
Ji, Heng",
year={2023},
eprint={2312.10160},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
rithwik-db/processed_demo | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': neg
'1': pos
splits:
- name: train
num_bytes: 4011938.76
num_examples: 3000
- name: test
num_bytes: 391808.22
num_examples: 300
download_size: 2812982
dataset_size: 4403746.9799999995
---
# Dataset Card for "processed_demo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Weyaxi__zephyr-alpha-Nebula-v2-7B | ---
pretty_name: Evaluation run of Weyaxi/zephyr-alpha-Nebula-v2-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/zephyr-alpha-Nebula-v2-7B](https://huggingface.co/Weyaxi/zephyr-alpha-Nebula-v2-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__zephyr-alpha-Nebula-v2-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-04T15:57:31.199945](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__zephyr-alpha-Nebula-v2-7B/blob/main/results_2023-12-04T15-57-31.199945.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5652954998411592,\n\
\ \"acc_stderr\": 0.03358458028499026,\n \"acc_norm\": 0.5715669236783297,\n\
\ \"acc_norm_stderr\": 0.03429444097066663,\n \"mc1\": 0.4283965728274174,\n\
\ \"mc1_stderr\": 0.01732308859731475,\n \"mc2\": 0.5827588614927685,\n\
\ \"mc2_stderr\": 0.015689365398538633\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5631399317406144,\n \"acc_stderr\": 0.014494421584256524,\n\
\ \"acc_norm\": 0.5861774744027304,\n \"acc_norm_stderr\": 0.014392730009221009\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6397132045409281,\n\
\ \"acc_stderr\": 0.004791024004588008,\n \"acc_norm\": 0.8305118502290381,\n\
\ \"acc_norm_stderr\": 0.003744157442536553\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.042849586397534015,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.042849586397534015\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5811320754716981,\n \"acc_stderr\": 0.030365050829115215,\n\
\ \"acc_norm\": 0.5811320754716981,\n \"acc_norm_stderr\": 0.030365050829115215\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082634,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082634\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4978723404255319,\n \"acc_stderr\": 0.03268572658667492,\n\
\ \"acc_norm\": 0.4978723404255319,\n \"acc_norm_stderr\": 0.03268572658667492\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.38095238095238093,\n \"acc_stderr\": 0.0250107491161376,\n \"\
acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.0250107491161376\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.04451807959055328,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.04451807959055328\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6903225806451613,\n \"acc_stderr\": 0.026302774983517418,\n \"\
acc_norm\": 0.6903225806451613,\n \"acc_norm_stderr\": 0.026302774983517418\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162933,\n \"\
acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162933\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624335,\n\
\ \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624335\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124498,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124498\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7512953367875648,\n \"acc_stderr\": 0.031195840877700293,\n\
\ \"acc_norm\": 0.7512953367875648,\n \"acc_norm_stderr\": 0.031195840877700293\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5717948717948718,\n \"acc_stderr\": 0.025088301454694827,\n\
\ \"acc_norm\": 0.5717948717948718,\n \"acc_norm_stderr\": 0.025088301454694827\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.03218358107742613,\n \
\ \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.03218358107742613\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8146788990825689,\n \"acc_stderr\": 0.016659279700295813,\n \"\
acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.016659279700295813\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.03350991604696042,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.03350991604696042\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7107843137254902,\n \"acc_stderr\": 0.031822318676475544,\n \"\
acc_norm\": 0.7107843137254902,\n \"acc_norm_stderr\": 0.031822318676475544\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460302,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460302\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5964125560538116,\n\
\ \"acc_stderr\": 0.03292802819330314,\n \"acc_norm\": 0.5964125560538116,\n\
\ \"acc_norm_stderr\": 0.03292802819330314\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.042607351576445594,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.042607351576445594\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724146,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724146\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.0432704093257873,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.0432704093257873\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.025819233256483717,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.025819233256483717\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7701149425287356,\n\
\ \"acc_stderr\": 0.015046301846691812,\n \"acc_norm\": 0.7701149425287356,\n\
\ \"acc_norm_stderr\": 0.015046301846691812\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.615606936416185,\n \"acc_stderr\": 0.026189666966272035,\n\
\ \"acc_norm\": 0.615606936416185,\n \"acc_norm_stderr\": 0.026189666966272035\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27150837988826815,\n\
\ \"acc_stderr\": 0.014874252168095268,\n \"acc_norm\": 0.27150837988826815,\n\
\ \"acc_norm_stderr\": 0.014874252168095268\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5849673202614379,\n \"acc_stderr\": 0.028213504177824103,\n\
\ \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.028213504177824103\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885142,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885142\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6574074074074074,\n \"acc_stderr\": 0.02640614597362567,\n\
\ \"acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.02640614597362567\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4326241134751773,\n \"acc_stderr\": 0.029555454236778852,\n \
\ \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.029555454236778852\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.439374185136897,\n\
\ \"acc_stderr\": 0.012676014778580212,\n \"acc_norm\": 0.439374185136897,\n\
\ \"acc_norm_stderr\": 0.012676014778580212\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5735294117647058,\n \"acc_stderr\": 0.030042615832714874,\n\
\ \"acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.030042615832714874\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5441176470588235,\n \"acc_stderr\": 0.020148939420415745,\n \
\ \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.020148939420415745\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5183673469387755,\n \"acc_stderr\": 0.03198761546763127,\n\
\ \"acc_norm\": 0.5183673469387755,\n \"acc_norm_stderr\": 0.03198761546763127\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7611940298507462,\n\
\ \"acc_stderr\": 0.030147775935409217,\n \"acc_norm\": 0.7611940298507462,\n\
\ \"acc_norm_stderr\": 0.030147775935409217\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.031267817146631786,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.031267817146631786\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4283965728274174,\n\
\ \"mc1_stderr\": 0.01732308859731475,\n \"mc2\": 0.5827588614927685,\n\
\ \"mc2_stderr\": 0.015689365398538633\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7355958958168903,\n \"acc_stderr\": 0.012394724896983796\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.23881728582259287,\n \
\ \"acc_stderr\": 0.011744097081003803\n }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/zephyr-alpha-Nebula-v2-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|arc:challenge|25_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|gsm8k|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hellaswag|10_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T15-57-31.199945.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T15-57-31.199945.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- '**/details_harness|winogrande|5_2023-12-04T15-57-31.199945.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-04T15-57-31.199945.parquet'
- config_name: results
data_files:
- split: 2023_12_04T15_57_31.199945
path:
- results_2023-12-04T15-57-31.199945.parquet
- split: latest
path:
- results_2023-12-04T15-57-31.199945.parquet
---
# Dataset Card for Evaluation run of Weyaxi/zephyr-alpha-Nebula-v2-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Weyaxi/zephyr-alpha-Nebula-v2-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Weyaxi/zephyr-alpha-Nebula-v2-7B](https://huggingface.co/Weyaxi/zephyr-alpha-Nebula-v2-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__zephyr-alpha-Nebula-v2-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-04T15:57:31.199945](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__zephyr-alpha-Nebula-v2-7B/blob/main/results_2023-12-04T15-57-31.199945.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5652954998411592,
"acc_stderr": 0.03358458028499026,
"acc_norm": 0.5715669236783297,
"acc_norm_stderr": 0.03429444097066663,
"mc1": 0.4283965728274174,
"mc1_stderr": 0.01732308859731475,
"mc2": 0.5827588614927685,
"mc2_stderr": 0.015689365398538633
},
"harness|arc:challenge|25": {
"acc": 0.5631399317406144,
"acc_stderr": 0.014494421584256524,
"acc_norm": 0.5861774744027304,
"acc_norm_stderr": 0.014392730009221009
},
"harness|hellaswag|10": {
"acc": 0.6397132045409281,
"acc_stderr": 0.004791024004588008,
"acc_norm": 0.8305118502290381,
"acc_norm_stderr": 0.003744157442536553
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.042849586397534015,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.042849586397534015
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5811320754716981,
"acc_stderr": 0.030365050829115215,
"acc_norm": 0.5811320754716981,
"acc_norm_stderr": 0.030365050829115215
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.625,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082634,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082634
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4978723404255319,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.4978723404255319,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.0250107491161376,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.0250107491161376
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.04451807959055328,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.04451807959055328
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6903225806451613,
"acc_stderr": 0.026302774983517418,
"acc_norm": 0.6903225806451613,
"acc_norm_stderr": 0.026302774983517418
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162933,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162933
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624335,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624335
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124498,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124498
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7512953367875648,
"acc_stderr": 0.031195840877700293,
"acc_norm": 0.7512953367875648,
"acc_norm_stderr": 0.031195840877700293
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5717948717948718,
"acc_stderr": 0.025088301454694827,
"acc_norm": 0.5717948717948718,
"acc_norm_stderr": 0.025088301454694827
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5672268907563025,
"acc_stderr": 0.03218358107742613,
"acc_norm": 0.5672268907563025,
"acc_norm_stderr": 0.03218358107742613
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242741,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242741
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8146788990825689,
"acc_stderr": 0.016659279700295813,
"acc_norm": 0.8146788990825689,
"acc_norm_stderr": 0.016659279700295813
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.03350991604696042,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.03350991604696042
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7107843137254902,
"acc_stderr": 0.031822318676475544,
"acc_norm": 0.7107843137254902,
"acc_norm_stderr": 0.031822318676475544
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460302,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460302
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5964125560538116,
"acc_stderr": 0.03292802819330314,
"acc_norm": 0.5964125560538116,
"acc_norm_stderr": 0.03292802819330314
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.042607351576445594,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.042607351576445594
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724146,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724146
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.0432704093257873,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.0432704093257873
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.025819233256483717,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.025819233256483717
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7701149425287356,
"acc_stderr": 0.015046301846691812,
"acc_norm": 0.7701149425287356,
"acc_norm_stderr": 0.015046301846691812
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.615606936416185,
"acc_stderr": 0.026189666966272035,
"acc_norm": 0.615606936416185,
"acc_norm_stderr": 0.026189666966272035
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27150837988826815,
"acc_stderr": 0.014874252168095268,
"acc_norm": 0.27150837988826815,
"acc_norm_stderr": 0.014874252168095268
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5849673202614379,
"acc_stderr": 0.028213504177824103,
"acc_norm": 0.5849673202614379,
"acc_norm_stderr": 0.028213504177824103
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885142,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885142
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.02640614597362567,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.02640614597362567
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4326241134751773,
"acc_stderr": 0.029555454236778852,
"acc_norm": 0.4326241134751773,
"acc_norm_stderr": 0.029555454236778852
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.439374185136897,
"acc_stderr": 0.012676014778580212,
"acc_norm": 0.439374185136897,
"acc_norm_stderr": 0.012676014778580212
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.030042615832714874,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.030042615832714874
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.020148939420415745,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.020148939420415745
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670239,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670239
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5183673469387755,
"acc_stderr": 0.03198761546763127,
"acc_norm": 0.5183673469387755,
"acc_norm_stderr": 0.03198761546763127
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7611940298507462,
"acc_stderr": 0.030147775935409217,
"acc_norm": 0.7611940298507462,
"acc_norm_stderr": 0.030147775935409217
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4283965728274174,
"mc1_stderr": 0.01732308859731475,
"mc2": 0.5827588614927685,
"mc2_stderr": 0.015689365398538633
},
"harness|winogrande|5": {
"acc": 0.7355958958168903,
"acc_stderr": 0.012394724896983796
},
"harness|gsm8k|5": {
"acc": 0.23881728582259287,
"acc_stderr": 0.011744097081003803
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
hiroshi-matsuda-rit/filtered_mc4 | ---
pretty_name: filtered-mc4
license:
- odc-by
multilinguality:
- multilingual
---
# Dataset Card for filtered-mc4
See the original [mC4 dataset](https://huggingface.co/datasets/mc4) description.
You can apply any regular expression to the mC4 dataset like this:
```python
from datasets import load_dataset
dataset = load_dataset(
    'hiroshi-matsuda-rit/filtered_mc4',
    'ja',
    split='train',
    reject_patterns=[r"(セフレ|出会い?系|(?<!ユニ)セックス|ソープガイド)", r"[^\s]\ [^\s]+\ [^\s]"],
    max_reject_pattern_occurence=3,
    streaming=True,
)
```
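For illustration, here is a rough sketch of how such reject patterns could behave when applied to raw text. This is a simplified stand-in, not the dataset's internal implementation, and it assumes `max_reject_pattern_occurence` means "reject a document once the patterns have matched that many times"; the patterns used below are placeholders, not the Japanese ones above.

```python
import re

def is_rejected(text, reject_patterns, max_occurrence=3):
    # Hypothetical filter: reject `text` if the combined number of
    # matches across all patterns reaches `max_occurrence`.
    hits = sum(len(re.findall(p, text)) for p in reject_patterns)
    return hits >= max_occurrence

patterns = [r"spam", r"ad{2,}"]  # placeholder patterns for the sketch
print(is_rejected("spam spam addd", patterns))         # True (3 hits)
print(is_rejected("one spam is fine here", patterns))  # False (1 hit)
```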
### Citation Information
```
@article{2019t5,
author = {Colin Raffel and Noam Shazeer and Adam Roberts and Katherine Lee and Sharan Narang and Michael Matena and Yanqi Zhou and Wei Li and Peter J. Liu},
title = {Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer},
journal = {arXiv e-prints},
year = {2019},
archivePrefix = {arXiv},
eprint = {1910.10683},
}
```
|
danaroth/salinas | ---
license: unknown
---
# Description
This scene was collected by the 224-band [AVIRIS sensor](http://aviris.jpl.nasa.gov/) over Salinas Valley, California, and is characterized by high spatial resolution (3.7-meter pixels). The area covered comprises 512 lines by 217 samples. As with Indian Pines scene, we discarded the 20 water absorption bands, in this case bands: [108-112], [154-167], 224. This image was available only as at-sensor radiance data. It includes vegetables, bare soils, and vineyard fields. Salinas groundtruth contains 16 classes.
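As a quick sanity check, the discarded water-absorption bands listed above can be enumerated in Python (band numbers are 1-based, matching the sensor's 224 bands):

```python
# Bands [108-112], [154-167], and 224 are the 20 water-absorption bands
# removed from the original 224, leaving 204 usable bands.
discarded = list(range(108, 113)) + list(range(154, 168)) + [224]
retained = [b for b in range(1, 225) if b not in set(discarded)]
print(len(discarded), len(retained))  # 20 204
```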
A small subscene of the Salinas image, denoted Salinas-A, is usually used too. It comprises 86×83 pixels located within the same scene at [samples, lines] = [591-676, 158-240] and includes six classes.
# Characteristics
Groundtruth classes for the Salinas scene and their respective number of samples.
| # | Class | Samples |
|----|----------------------------|---------|
| 1 | Broccoli_green_weeds_1 | 2009 |
| 2 | Broccoli_green_weeds_2 | 3726 |
| 3 | Fallow | 1976 |
| 4 | Fallow_rough_plow | 1394 |
| 5 | Fallow_smooth | 2678 |
| 6 | Stubble | 3959 |
| 7 | Celery | 3579 |
| 8 | Grapes_untrained | 11271 |
| 9 | Soil_vinyard_develop | 6203 |
| 10 | Corn_senesced_green_weeds | 3278 |
| 11 | Lettuce_romaine_4wk | 1068 |
| 12 | Lettuce_romaine_5wk | 1927 |
| 13 | Lettuce_romaine_6wk | 916 |
| 14 | Lettuce_romaine_7wk | 1070 |
| 15 | Vinyard_untrained | 7268 |
| 16 | Vinyard_vertical_trellis | 1807 |
Groundtruth classes for the Salinas-A scene and their respective number of samples.
| # | Class | Samples |
|---|---------------------------|---------|
| 1 | Broccoli_green_weeds_1 | 391 |
| 2 | Corn_senesced_green_weeds | 1343 |
| 3 | Lettuce_romaine_4wk | 616 |
| 4 | Lettuce_romaine_5wk | 1525 |
| 5 | Lettuce_romaine_6wk | 674 |
| 6 | Lettuce_romaine_7wk | 799 |
# Quick look
<figure>
<img src= "assets/Salinas_170.png" alt="Salinas" width="300" />
<figcaption>Sample band of Salinas dataset.</figcaption>
</figure>
<figure>
<img src= "assets/Salinas_gt.png" alt="Salinas gt" width="300" />
<figcaption>Groundtruth of Salinas dataset.</figcaption>
</figure>
<figure>
<img src= "assets/SalinasA_170.png" alt="SalinasA" width="300" />
<figcaption>Sample band of Salinas-A dataset.</figcaption>
</figure>
<figure>
<img src= "assets/SalinasA_gt.png" alt="SalinasA gt" width="300" />
<figcaption>Groundtruth of Salinas-A dataset.</figcaption>
</figure>
# Credits
This dataset was originally collected by Manuel Graña, Miguel-Angel Veganzones, and Borja Ayerdi.
The original link for the dataset is available below:
https://www.ehu.eus/ccwintco/index.php/Hyperspectral_Remote_Sensing_Scenes |
GATE-engine/omniglot | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: int64
splits:
- name: full
num_bytes: 11924141.5
num_examples: 32460
download_size: 10520482
dataset_size: 11924141.5
---
# Dataset Card for "omniglot"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fadliaulawi/github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: labels
list:
- name: id
dtype: int64
- name: node_id
dtype: string
- name: url
dtype: string
- name: name
dtype: string
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: assignees
list:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: comments
sequence: string
- name: created_at
dtype: int64
- name: updated_at
dtype: int64
- name: closed_at
dtype: int64
- name: author_association
dtype: string
- name: active_lock_reason
dtype: 'null'
- name: body
dtype: string
- name: reactions
struct:
- name: url
dtype: string
- name: total_count
dtype: int64
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: laugh
dtype: int64
- name: hooray
dtype: int64
- name: confused
dtype: int64
- name: heart
dtype: int64
- name: rocket
dtype: int64
- name: eyes
dtype: int64
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: state_reason
dtype: string
- name: draft
dtype: float64
- name: pull_request
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: diff_url
dtype: string
- name: patch_url
dtype: string
- name: merged_at
dtype: timestamp[s]
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 27666811
num_examples: 5998
download_size: 8039647
dataset_size: 27666811
---
# Dataset Card for "github-issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gardner/SlimOrca-Dedup-trl-conversational-chatml | ---
language:
- en
license: mit
task_categories:
- text-generation
- conversational
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: chatml
dtype: string
splits:
- name: train
num_bytes: 1220288632
num_examples: 363491
download_size: 617604809
dataset_size: 1220288632
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
tags:
- chatml
- trl
- conversational
---
This dataset contains JSON formatted in TRL's [conversational format](https://huggingface.co/docs/trl/main/en/sft_trainer#dataset-format-support), as well as a ChatML-formatted text field.
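As a rough sketch of how the two fields relate, a conversation list of `{"from", "value"}` turns can be rendered as a ChatML string like the one stored in the `chatml` column. The `from` → role mapping below (`system`, `human`, `gpt`) is an assumption about this dataset's labels, not something confirmed by the card:

```python
# Sketch: render a TRL-style conversation list as a ChatML string.
# The "from" -> role mapping is an assumption about this dataset's labels.
ROLE_MAP = {"system": "system", "human": "user", "gpt": "assistant"}

def to_chatml(conversations):
    """Join {"from", "value"} turns into <|im_start|>role ... <|im_end|> blocks."""
    parts = []
    for turn in conversations:
        role = ROLE_MAP.get(turn["from"], turn["from"])
        parts.append(f"<|im_start|>{role}\n{turn['value']}<|im_end|>")
    return "\n".join(parts)

example = [
    {"from": "system", "value": "You are a helpful assistant."},
    {"from": "human", "value": "Hi!"},
]
print(to_chatml(example))
```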
|
jlbaker361/anime_faces_dim_128_40k | ---
dataset_info:
features:
- name: image
dtype: image
- name: split
dtype: string
- name: src
dtype: string
- name: style
dtype: string
splits:
- name: train
num_bytes: 1076790828.0
num_examples: 40000
download_size: 1075448120
dataset_size: 1076790828.0
---
# Dataset Card for "anime_faces_dim_128_40k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/squad_qa_wrong_num_v5_full_recite_full_passage_random_permute_rerun_4 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: correct_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 7715660.635846373
num_examples: 4345
- name: validation
num_bytes: 584108
num_examples: 300
download_size: 1706292
dataset_size: 8299768.635846373
---
# Dataset Card for "squad_qa_wrong_num_v5_full_recite_full_passage_random_permute_rerun_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1713070417 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 13110
num_examples: 29
download_size: 8834
dataset_size: 13110
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713070417"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tiennv/gaze-following-short | ---
dataset_info:
features:
- name: image
dtype: image
- name: split
dtype: string
- name: width
dtype: int64
- name: height
dtype: int64
- name: bboxes
dtype: string
- name: labels
dtype: string
- name: cab
dtype: int64
- name: hum
dtype: int64
- name: light
dtype: float64
- name: cam
dtype: int64
- name: env
dtype: int64
- name: gaze_item
dtype: int64
- name: gazeIdx
dtype: int64
- name: gaze_cx
dtype: int64
- name: gaze_cy
dtype: int64
- name: hx
dtype: int64
- name: hy
dtype: int64
- name: pitch
dtype: float64
- name: yaw
dtype: float64
- name: roll
dtype: float64
- name: seg
dtype: string
- name: segm_gazeIdx
dtype: int64
- name: occluded
dtype: int64
splits:
- name: train
num_bytes: 501605752.0
num_examples: 869
download_size: 500172405
dataset_size: 501605752.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "gaze-following-short"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aintech/vdf_midlib-clip-avg |
---
tags:
- vdf
- vector-io
- vector-dataset
- vector-embeddings
---
This is a dataset created using [vector-io](https://github.com/ai-northstar-tech/vector-io)
|
Rhasan97/test_data | ---
license: apache-2.0
---
|
gigant/romanian_speech_synthesis_0_8_1 | ---
language:
- ro
license:
- unknown
size_categories:
ro:
- 1K<n<10K
task_categories:
- automatic-speech-recognition
task_ids: []
pretty_name: Romanian Speech Synthesis
---
## Dataset Description
- **Homepage:** https://romaniantts.com/rssdb/
- **Paper:** https://www.sciencedirect.com/science/article/abs/pii/S0167639310002074
### Dataset Summary
The Romanian speech synthesis (RSS) corpus was recorded in a hemianechoic chamber (anechoic walls and ceiling; floor partially anechoic) at the University of Edinburgh. We used three high quality studio microphones: a Neumann u89i (large diaphragm condenser), a Sennheiser MKH 800 (small diaphragm condenser with very wide bandwidth) and a DPA 4035 (headset-mounted condenser). Although the current release includes only speech data recorded via the Sennheiser MKH 800, we may release speech data recorded via the other microphones in the future.

All recordings were made at 96 kHz sampling frequency and 24 bits per sample, then downsampled to 48 kHz sampling frequency. For recording, downsampling and bit rate conversion, we used ProTools HD hardware and software.

We conducted 8 sessions over the course of a month, recording about 500 sentences in each session. At the start of each session, the speaker listened to a previously recorded sample, in order to attain a similar voice quality and intonation.
### Languages
Romanian
## Dataset Structure
### Data Instances
A typical data point comprises the path to the audio file, called audio and its sentence.
### Data Fields
- audio: A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`.
- sentence: The sentence the user was prompted to speak
### Data Splits
The speech material has been subdivided into portions for train and test.
The train split consists of 3180 audio clips and the related sentences.
The test split consists of 536 audio clips and the related sentences.
### Citation Information
```
@article{Stan2011442,
author = {Adriana Stan and Junichi Yamagishi and Simon King and
Matthew Aylett},
title = {The {R}omanian speech synthesis ({RSS}) corpus:
Building a high quality {HMM}-based speech synthesis
system using a high sampling rate},
journal = {Speech Communication},
volume = {53},
number = {3},
pages = {442--450},
note = {},
abstract = {This paper first introduces a newly-recorded high
quality Romanian speech corpus designed for speech
synthesis, called ''RSS'', along with Romanian
front-end text processing modules and HMM-based
synthetic voices built from the corpus. All of these
are now freely available for academic use in order to
promote Romanian speech technology research. The RSS
corpus comprises 3500 training sentences and 500 test
sentences uttered by a female speaker and was recorded
using multiple microphones at 96 kHz sampling
frequency in a hemianechoic chamber. The details of the
new Romanian text processor we have developed are also
given. Using the database, we then revisit some basic
configuration choices of speech synthesis, such as
waveform sampling frequency and auditory frequency
warping scale, with the aim of improving speaker
similarity, which is an acknowledged weakness of
current HMM-based speech synthesisers. As we
demonstrate using perceptual tests, these configuration
choices can make substantial differences to the quality
of the synthetic speech. Contrary to common practice in
automatic speech recognition, higher waveform sampling
frequencies can offer enhanced feature extraction and
improved speaker similarity for HMM-based speech
synthesis.},
doi = {10.1016/j.specom.2010.12.002},
issn = {0167-6393},
keywords = {Speech synthesis, HTS, Romanian, HMMs, Sampling
frequency, Auditory scale},
url = {http://www.sciencedirect.com/science/article/pii/S0167639310002074},
year = 2011
}
```
### Contributions
[@gigant](https://huggingface.co/gigant) added this dataset. |
Vinnyyw/Anysolos | ---
license: openrail
---
|
DmitryYarov/Test | ---
license: mit
---
|
achinthani/test-1 | ---
size_categories: n<1K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for test-1
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("achinthani/test-1")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("achinthani/test-1")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, **vectors**, and **guidelines**.
The **fields** are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| text | Text | text | True | False |
| text_annotation | Text_annotation | text | True | False |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| sentiment | Sentiment | label_selection | True | N/A | ['positive', 'neutral', 'negative'] |
| mixed-emotion | Mixed-emotion | multi_label_selection | True | N/A | ['joy', 'anger', 'sadness', 'fear', 'surprise', 'love'] |
| ranking | Ranking | ranking | True | N/A | ['1', '2', '3', '4', '5'] |
| rating | Rating | rating | True | N/A | [1, 2, 3, 4, 5] |
The **suggestions** are human or machine generated recommendations for each question to assist the annotator during the annotation process, so those are always linked to the existing questions, and named appending "-suggestion" and "-suggestion-metadata" to those, containing the value/s of the suggestion and its metadata, respectively. So on, the possible values are the same as in the table above, but the column name is appended with "-suggestion" and the metadata is appended with "-suggestion-metadata".
The **metadata** is a dictionary that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
| Metadata Name | Title | Type | Values | Visible for Annotators |
| ------------- | ----- | ---- | ------ | ---------------------- |
The **guidelines**, are optional as well, and are just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": null,
"fields": {
"text": "Absolutely infuriated by the lack of accountability in our government. It\u0027s time for real change!",
"text_annotation": ""
},
"metadata": {},
"responses": [],
"suggestions": [],
"vectors": {}
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"external_id": null,
"metadata": "{}",
"mixed-emotion": [],
"mixed-emotion-suggestion": null,
"mixed-emotion-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"ranking": [],
"ranking-suggestion": null,
"ranking-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"rating": [],
"rating-suggestion": null,
"rating-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"sentiment": [],
"sentiment-suggestion": null,
"sentiment-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"text": "Absolutely infuriated by the lack of accountability in our government. It\u0027s time for real change!",
"text_annotation": ""
}
```
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves, for the moment just text fields are supported. These are the ones that will be used to provide responses to the questions.
* **text** is of type `text`.
* **text_annotation** is of type `text`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **sentiment** is of type `label_selection` with the following allowed values ['positive', 'neutral', 'negative'].
* **mixed-emotion** is of type `multi_label_selection` with the following allowed values ['joy', 'anger', 'sadness', 'fear', 'surprise', 'love'].
* **ranking** is of type `ranking` with the following allowed values ['1', '2', '3', '4', '5'].
* **rating** is of type `rating` with the following allowed values [1, 2, 3, 4, 5].
* **Suggestions:** As of Argilla 1.13.0, the suggestions have been included to provide the annotators with suggestions to ease or assist during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **sentiment-suggestion** is of type `label_selection` with the following allowed values ['positive', 'neutral', 'negative'].
* (optional) **mixed-emotion-suggestion** is of type `multi_label_selection` with the following allowed values ['joy', 'anger', 'sadness', 'fear', 'surprise', 'love'].
* (optional) **ranking-suggestion** is of type `ranking` with the following allowed values ['1', '2', '3', '4', '5'].
* (optional) **rating-suggestion** is of type `rating` with the following allowed values [1, 2, 3, 4, 5].
Additionally, we also have two more fields that are optional and are the following:
* **metadata:** This is an optional field that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
Emotion is a dataset of English Twitter messages with six basic emotions: anger, fear, joy, love, sadness, and surprise.
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
PaulineSanchez/Multi_restaurants_menus_translation | ---
dataset_info:
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- en
- fr
splits:
- name: train
num_bytes: 56195
num_examples: 503
download_size: 36115
dataset_size: 56195
---
# Dataset Card for "Multi_restaurants_menus_translation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/med_alpaca_standardized_cluster_49 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 43048724
num_examples: 4685
download_size: 11622494
dataset_size: 43048724
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_49"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
slnader/fcc-comments | ---
annotations_creators:
- expert-generated
language:
- en
language_creators:
- found
license:
- cc-by-nc-sa-4.0
multilinguality:
- monolingual
pretty_name: fcc-comments
size_categories:
- 10M<n<100M
source_datasets:
- original
tags:
- notice and comment
- regulation
- government
task_categories:
- text-retrieval
task_ids:
- document-retrieval
---
# Dataset Card for fcc-comments
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** https://github.com/slnader/fcc-comments
- **Paper:** https://doi.org/10.1002/poi3.327
### Dataset Summary
Online comment floods during public consultations have posed unique governance challenges for
regulatory bodies seeking relevant information on proposed regulations.
How should regulatory bodies separate spam and fake comments from genuine submissions by the public,
especially when fake comments are designed to imitate ordinary citizens? How can regulatory bodies
achieve both breadth and depth in their citations to the comment corpus? What is the best way to
select comments that represent the average submission and comments that supply highly specialized
information?
`fcc-comments` is an annotated version of the comment corpus from the Federal Communications Commission's
(FCC) 2017 "Restoring Internet Freedom" proceeding. The source data were downloaded directly from the FCC's Electronic
Comment Filing System (ECFS) between January and February of 2019 and include raw comment text and metadata on
comment submissions. The comment data were processed to be in a consistent format
(machine-readable pdf or plain text), and annotated with three types of information: whether the comment was cited in the
agency's final order, the type of commenter (individual, interest group, business group), and whether the comment was associated with an in-person meeting.
The release also includes query-term and document-term matrices to facilitate keyword searches on the comment corpus.
An example of how these can be used with the bm25 algorithm can be found
[here](https://github.com/slnader/fcc-comments/blob/main/process_comments/1_score_comments.py).
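As a minimal sketch of that scoring step, the snippet below computes Okapi BM25 scores over plain `{term: count}` dictionaries rather than the released sparse matrices, with standard parameter defaults (`k1=1.5`, `b=0.75`) assumed rather than taken from the linked script:

```python
import math

def bm25_scores(query_terms, docs, k1=1.5, b=0.75):
    """Okapi BM25 score of each doc against the query, where a doc is a {term: count} dict."""
    n = len(docs)
    avgdl = sum(sum(d.values()) for d in docs) / n  # average document length
    # document frequency of each query term
    df = {t: sum(1 for d in docs if t in d) for t in query_terms}
    scores = []
    for d in docs:
        dl = sum(d.values())
        score = 0.0
        for t in query_terms:
            tf = d.get(t, 0)
            if tf == 0:
                continue
            idf = math.log((n - df[t] + 0.5) / (df[t] + 0.5) + 1)
            score += idf * tf * (k1 + 1) / (tf + k1 * (1 - b + b * dl / avgdl))
        scores.append(score)
    return scores

docs = [
    {"net neutrality": 3, "broadband access": 1},  # hypothetical bigram counts
    {"customer service": 2},
]
print(bm25_scores(["net neutrality"], docs))
```

The same arithmetic applies row-by-row to the `search_dtms*.pickle` matrices, where each row holds the bigram counts for one comment page.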
## Dataset Structure
FCC relational database (fcc.pgsql): The core components of the database include a table for submission metadata,
a table for attachment metadata, a table for filer metadata, and a table that contains comment text if submitted in express format.
In addition to these core tables, there are several derived tables specific to the analyses in the paper,
including which submissions and attachments were cited in the final order, which submissions were associated with in-person meetings,
and which submissions were associated with interest groups. Full documentation of the tables can be found in fcc_database.md.
Attachments (attachments.tar.gz): Attachments to submissions that could be converted to text via OCR and saved in machine-readable pdf format.
The filenames are formatted as [submission_id]_[document_id].pdf, where submission_id and document_id are keys in the relational database.
Search datasets (search.tar.gz): Objects to facilitate prototyping of search algorithms on the comment corpus. Contains the following elements:
| Filename | description |
| ----------- | ----------- |
| query_dtm.pickle | Query-term matrix (79x3986) in sparse csr format (rows are queries, columns are bigram keyword counts). |
| query_text.pickle | Dictionary keyed by the paragraph number in the FCC’s Notice of Proposed Rulemaking. Values are the text of the query containing a call for comments. |
| search_dtms_express.pickle | Document-term matrix for express comments (3800691x3986) in sparse csr format (rows are comment pages, columns are bigram keyword counts). |
| search_index_express.pickle | Pandas dataframe containing unique id and total term length for express comments. |
| search_dtms.pickle | Document-term matrix for standard comment attachments (44655x3986) in sparse csr format (rows are comment pages, columns are bigram keyword counts). |
| search_index.pickle | Pandas dataframe containing unique id and total term length for standard comment attachments. |
### Data Fields
The following tables are available in fcc.pgsql:
- comments: plain text comments associated with submissions
| column | type | description |
| ----------- | ----------- | ----------- |
| comment_id | character varying(64) | unique id for plain text comment |
| comment_text | text | raw text of plain text comment |
| row_id | integer | row sequence for plain text comments |
- submissions: metadata for submissions
| column | type | description |
| ----------- | ----------- | ----------- |
| submission_id | character varying(20) | unique id for submission |
| submission_type | character varying(100) | type of submission (e.g., comment, reply, statement) |
| express_comment | numeric | 1 if express comment |
| date_received | date | date submission was received |
| contact_email | character varying(255) | submitter email address |
| city | character varying(255) | submitter city |
| address_line_1 | character varying(255) | submitter address line 1 |
| address_line_2 | character varying(255) | submitter address line 2 |
| state | character varying(255) | submitter state |
| zip_code | character varying(50) | submitter zip |
| comment_id | character varying(64) | unique id for plain text comment |
- filers: names of filers associated with submissions
| column | type | description |
| ----------- | ----------- | ----------- |
| submission_id | character varying(20) | unique id for submission |
| filer_name | character varying(250) | name of filer associated with submission |
- documents: attachments associated with submissions
| column | type | description |
| ----------- | ----------- | ----------- |
| submission_id | character varying(20) | unique id for submission |
| document_name | text | filename of attachment |
| download_status | numeric | status of attachment download |
| document_id | character varying(64) | unique id for attachment |
| file_extension | character varying(4) | file extension for attachment |
- filers_cited: citations from final order
| column | type | description |
| ----------- | ----------- | ----------- |
| point | numeric | paragraph number in final order |
| filer_name | character varying(250) | name of cited filer |
| submission_type | character varying(12) | type of submission as indicated in final order |
| page_numbers | text[] | cited page numbers |
| cite_id | integer | unique id for citation |
| filer_id | character varying(250) | id for cited filer |
- docs_cited: attachments associated with cited submissions
| column | type | description |
| ----------- | ----------- | ----------- |
| cite_id | numeric | unique id for citation |
| submission_id | character varying(20) | unique id for submission |
| document_id | character varying(64) | unique id for attachment |
- near_duplicates: lookup table for comment near-duplicates
| column | type | description |
| ----------- | ----------- | ----------- |
| target_document_id | | unique id for target document |
| duplicate_document_id | | unique id for duplicate of target document |
- exact_duplicates: lookup table for comment exact duplicates
| column | type | description |
| ----------- | ----------- | ----------- |
| target_document_id | character varying(100) | unique id for target document |
| duplicate_document_id | character varying(100) | unique id for duplicate of target document |
- in_person_exparte: submissions associated with ex parte meeting
| column | type | description |
| ----------- | ----------- | ----------- |
| submission_id | character varying(20) | unique id for submission |
- interest_groups: submissions associated with interest groups
| column | type | description |
| ----------- | ----------- | ----------- |
| submission_id | character varying(20) | unique id for submission |
| business | numeric | 1 if business group, 0 otherwise |
## Dataset Creation
### Curation Rationale
The data were curated to perform information retrieval and summarization tasks as documented in https://doi.org/10.1002/poi3.327.
### Source Data
#### Initial Data Collection and Normalization
The data for this study come from the FCC's Electronic Comment Filing System (ECFS), accessed between January and February of 2019.
I converted the API responses into a normalized, relational database containing information on 23,951,967 submissions.
23,938,686 "express" submissions contained a single plain text comment submitted directly through the comment form.
13,821 "standard" submissions contained one or more comment documents submitted as attachments in various file formats.
While the FCC permitted any file format for attachments, I only consider documents attached in pdf, plain text, rich text,
and Microsoft Word file formats, and I drop submitted documents that were simply copies of the FCC’s official documents (e.g., the NPRM itself).
Using standard OCR software, I attempted to convert all attachments into plain text and saved them as machine-readable pdfs.
#### Who are the source language producers?
All submitters of public comments during the public comment period (but see note on fake comments in considerations).
### Annotations
#### Annotation process
- Citations: I consider citations from the main text of the FCC's final rule. I did not include citations to
supporting documents not available through ECFS (e.g., court decisions), nor did I include citations
to submissions from prior FCC proceedings. The direct citations to filed submissions are included
in a series of 1,186 footnotes. The FCC’s citation format typically followed a relatively standard
pattern: the name of the filer (e.g., Verizon), a description of the document (e.g., Comment), and
at times a page number. I extracted citations from the text using regular expressions. Based on a
random sample of paragraphs from the final order, the regular expressions identified 98% of eligible citations,
while successfully excluding all non-citation text. In total, this produced 1,886 unique citations.
I then identified which of the comments were cited. First, I identified all documents from the cited filer
that had enough pages to contain the page number cited (if provided), and, where applicable, whose filename
contained the moniker from the FCC’s citation (e.g., "Reply"). The majority of citations matched to only one
possible comment submitted, and I identified the remaining cited comments through manual review of the citations.
In this way, I was able to tag documents associated with all but three citations. When the same cited document was
submitted under multiple separate submissions, I tagged all versions of the document as being cited.
- Commenter type: Comments are labeled as mass comments if 10 or more duplicate or near-duplicate copies were
submitted by individual commenters. Near-duplicates were defined as comments with non-zero identical information scores.
To identify the type of commenter for non-mass comments, I take advantage of the fact that the vast majority of organized
groups preferred standard submissions over express submissions. Any non-mass comment submitted as an express comment was
coded as coming from an individual. To distinguish between individuals and organizations that used standard submissions,
I use a first name and surname database from the names dataset Python package to characterize filer names as belonging to
individuals or organizations. I also use the domain of the submitter’s email address to re-categorize comments as coming
from organizations if they were submitted on behalf of organizations by an individual. Government officials were identified by
their .gov email addresses. I manually review this procedure for mischaracterizations. After obtaining a list of organization
names, I manually code each one as belonging to a business group or a non-business group. Government officials writing in
their official capacity were categorized as a non-business group.
- In-person meetings: To identify which commenters held in-person meetings with the agency, I collect all comments labeled
as an ex parte submission in the ECFS. I manually review these submissions for mention of an in-person meeting. I label
a commenter as having held an in-person meeting if they submitted at least one ex-parte document that mentioned an in-person meeting.
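As an illustration of the citation pattern described above (a filer name, a submission type, and an optional page number), a simplified regular expression might look like the following; this is a hedged sketch with an invented footnote, not the actual expressions used in the paper:

```python
import re

# Simplified illustration of the FCC citation format described above:
# a filer name, a submission type, and an optional page reference.
# The actual expressions used in the paper are more elaborate.
CITE_RE = re.compile(
    r"(?P<filer>[A-Z][\w&.\- ]+?)\s+"
    r"(?P<type>Comments?|Reply|Ex Parte)"
    r"(?:\s+at\s+(?P<page>\d+))?"
)

footnote = "Verizon Comments at 12; INCOMPAS Reply."  # invented example text
cites = [m.groupdict() for m in CITE_RE.finditer(footnote)]
print(cites)
# → [{'filer': 'Verizon', 'type': 'Comments', 'page': '12'},
#    {'filer': 'INCOMPAS', 'type': 'Reply', 'page': None}]
```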
#### Who are the annotators?
Annotations are a combination of automated and manual review done by the author.
### Personal and Sensitive Information
This dataset may contain personal and sensitive information, as there were no restrictions on what commenters could submit to
the agency. This dataset also contains numerous examples of profanity and spam. These comments represent what the FCC decided was
appropriate to share publicly on their own website.
## Considerations for Using the Data
### Discussion of Biases
This proceeding was famous for the large number of "fake" comments (comments impersonating ordinary citizens) submitted to the
agency (see [this report](https://ag.ny.gov/sites/default/files/oag-fakecommentsreport.pdf) by the NY AG for more information).
As such, this comment corpus contains a mix of computer-generated and natural language, and there is currently no way to reliably separate
mass comments submitted with the approval of the commenter and those submitted on behalf of the commenter without their knowledge.
## Additional Information
### Licensing Information
Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International.
### Citation Information
```
@article{handan2022,
title={Do fake online comments pose a threat to regulatory policymaking? Evidence from Internet regulation in the United States},
author={Handan-Nader, Cassandra},
journal={Policy \& Internet},
year={2022}
}
``` |
CyberHarem/syndra_leagueoflegends | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Syndra (League of Legends)
This is the dataset of Syndra (League of Legends), containing 16 images and their tags.
The core tags of this character are `long_hair, breasts, purple_eyes, large_breasts, purple_hair, very_long_hair, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 16 | 20.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/syndra_leagueoflegends/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 16 | 12.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/syndra_leagueoflegends/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 31 | 20.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/syndra_leagueoflegends/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 16 | 17.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/syndra_leagueoflegends/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 31 | 27.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/syndra_leagueoflegends/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/syndra_leagueoflegends',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, solo, looking_at_viewer, thighhighs, bare_shoulders, alternate_costume, cleavage, elbow_gloves, high_heels, smile, star_guardian_(league_of_legends) |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | thighhighs | bare_shoulders | alternate_costume | cleavage | elbow_gloves | high_heels | smile | star_guardian_(league_of_legends) |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:-------------|:-----------------|:--------------------|:-----------|:---------------|:-------------|:--------|:------------------------------------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X |
|
flyingfishinwater/ultrafeedback_clean | ---
language:
- en
license: mit
task_categories:
- conversational
- text-generation
pretty_name: UltraFeedback Binarized
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: test_sft
path: data/test_sft-*
- split: train_gen
path: data/train_gen-*
- split: test_gen
path: data/test_gen-*
- split: train_prefs
path: data/train_prefs-*
- split: test_prefs
path: data/test_prefs-*
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: score_chosen
dtype: float64
- name: score_rejected
dtype: float64
splits:
- name: train_sft
num_bytes: 397273717
num_examples: 61966
- name: test_sft
num_bytes: 6270496
num_examples: 1000
- name: train_gen
num_bytes: 316634390
num_examples: 61966
- name: test_gen
num_bytes: 5008220
num_examples: 1000
- name: train_prefs
num_bytes: 397273717
num_examples: 61966
- name: test_prefs
num_bytes: 12782225
num_examples: 2000
download_size: 636467735
dataset_size: 1135242765
---
# Dataset Card for UltraFeedback Cleaned
## Dataset Description
This is a cleaned version of [HuggingFaceH4/ultrafeedback_binarized](https://huggingface.co/datasets/HuggingFaceH4/ultrafeedback_binarized),
converted into jsonl format for DPO or PPO training.
I performed the following cleaning steps:
1. Remove all lines containing 'translation' or 'translate'. I believe the few translation tasks are not good for fine-tuning.
2. Remove all answers that start with 'User: As an AI assistan'. It is a mistake that these assistant answers contain the prompt.
3. Remove all lines containing 'As an AI assistant, I will no]'. These prompts/answers are malformed.
4. Remove every part that starts with 'As an AI ... However, '. GPT likes to say that, but I prefer to make the AI sound more like a human than a machine.
5. Remove every part that starts with 'As an AI ...' up to the first period. Same reason as above.
6. Remove all '</s>' tokens in answers. Those are malformed.
If you don't like some or all of these steps, you can modify the Python file "dpo_jsonl_formater.py" to meet your requirements and regenerate the jsonl files.
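A minimal sketch of filters in the spirit of the steps above (the function names and exact patterns here are illustrative, not the actual code in "dpo_jsonl_formater.py"):

```python
import re

def clean_answer(text: str) -> str:
    """Illustrative answer-level filters in the spirit of steps 4-6 above."""
    text = text.replace("</s>", "")                  # step 6: strip stray EOS tokens
    text = re.sub(r"^As an AI[^.]*\.\s*", "", text)  # steps 4-5 (simplified)
    return text.strip()

def keep_example(prompt: str, answer: str) -> bool:
    """Illustrative row-level filters in the spirit of steps 1-2 above."""
    if "translat" in prompt.lower():                 # step 1: covers translate/translation
        return False
    if answer.startswith("User: As an AI assistan"):  # step 2
        return False
    return True

print(clean_answer("As an AI assistant, I can help. Here is the answer.</s>"))
# → Here is the answer.
```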
## Dataset Structure
### Data Splits
The dataset has six splits, suitable for:
* Supervised fine-tuning (`sft`).
* Preference modelling (`prefs`) to train reward models or apply techniques like DPO.
* Generation ranking (`gen`) via techniques like rejection sampling or PPO.
The number of examples per split is shown as follows:
| train_sft | test_sft | train_prefs | test_prefs | train_gen | test_gen |
|:-------:|:-----------:|:-----:| :-----:| :-----:| :-----:|
| 57170 | 926 | 57170 | 1846 | 57170 | 926 |
The dataset is stored in parquet format with each entry using the following schema:
```json
{
"prompt_id": "2ebd7aee7e4da986e8a8880371e86cb7685daaa7993fc357245ff94705060e5e",
"prompt": "In a world where workplace safety is of utmost importance, there's a need for innovative training methods that can prepare employees to face hazardous scenarios...",
"score_chosen": 8.0,
"score_rejected": 7.5,
"chosen": "You have highlighted some very important aspects of using Virtual Reality (VR) technology for workplace safety training...",
"rejected": "When considering the use of virtual reality technology for safety training, several key factors should be taken into account to determine its effectiveness and suitability for a specific workplace environment..."
}
```
You should use the `chosen` and `rejected` columns for techniques like DPO, SFT or PPO.
## Citation
If you find this dataset useful in your work, please cite the original UltraFeedback dataset: https://huggingface.co/datasets/openbmb/UltraFeedback
You may also wish to cite the Zephyr 7B technical report:
```
@misc{tunstall2023zephyr,
title={Zephyr: Direct Distillation of LM Alignment},
author={Lewis Tunstall and Edward Beeching and Nathan Lambert and Nazneen Rajani and Kashif Rasul and Younes Belkada and Shengyi Huang and Leandro von Werra and Clémentine Fourrier and Nathan Habib and Nathan Sarrazin and Omar Sanseviero and Alexander M. Rush and Thomas Wolf},
year={2023},
eprint={2310.16944},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
``` |
nps798/SampleDrugAppearanceDataset | ---
license: apache-2.0
---
|
HuggingFaceM4/VisDial_modif | |
orYx-models/Mistral-7b-LoRA-Medical-meadows-wikidoc | ---
license: apache-2.0
---
|
alexg99/captioned_flickr_faces_2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: conditioning_image
dtype: image
splits:
- name: train
num_bytes: 16639806446.161
num_examples: 40849
download_size: 16414683296
dataset_size: 16639806446.161
---
# Dataset Card for "captioned_flickr_faces_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/python3-standardized_cluster_11_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 9236546
num_examples: 7224
download_size: 0
dataset_size: 9236546
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python3-standardized_cluster_11_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lmqg/qg_squad | ---
license: cc-by-4.0
pretty_name: SQuAD for question generation
language: en
multilinguality: monolingual
size_categories: 10K<n<100K
source_datasets: squad
task_categories:
- text-generation
task_ids:
- language-modeling
tags:
- question-generation
---
# Dataset Card for "lmqg/qg_squad"
## Dataset Description
- **Repository:** [https://github.com/asahi417/lm-question-generation](https://github.com/asahi417/lm-question-generation)
- **Paper:** [https://arxiv.org/abs/2210.03992](https://arxiv.org/abs/2210.03992)
- **Point of Contact:** [Asahi Ushio](http://asahiushio.com/)
### Dataset Summary
This is a subset of [QG-Bench](https://github.com/asahi417/lm-question-generation/blob/master/QG_BENCH.md#datasets), a unified question generation benchmark proposed in
["Generative Language Models for Paragraph-Level Question Generation: A Unified Benchmark and Evaluation, EMNLP 2022 main conference"](https://arxiv.org/abs/2210.03992).
This is the [SQuAD](https://rajpurkar.github.io/SQuAD-explorer/) dataset for the question generation (QG) task. The split
of train/development/test sets follows the ["Neural Question Generation"](https://arxiv.org/abs/1705.00106) work and is
compatible with the [leaderboard](https://paperswithcode.com/sota/question-generation-on-squad11).
### Supported Tasks and Leaderboards
* `question-generation`: The dataset is assumed to be used to train a model for question generation.
Success on this task is typically measured by achieving a high BLEU4/METEOR/ROUGE-L/BERTScore/MoverScore (see our paper for more detail).
This task has an active leaderboard, which can be found [here](https://paperswithcode.com/sota/question-generation-on-squad11).
### Languages
English (en)
## Dataset Structure
An example of 'train' looks as follows.
```
{
"question": "What is heresy mainly at odds with?",
"paragraph": "Heresy is any provocative belief or theory that is strongly at variance with established beliefs or customs. A heretic is a proponent of such claims or beliefs. Heresy is distinct from both apostasy, which is the explicit renunciation of one's religion, principles or cause, and blasphemy, which is an impious utterance or action concerning God or sacred things.",
"answer": "established beliefs or customs",
"sentence": "Heresy is any provocative belief or theory that is strongly at variance with established beliefs or customs .",
"paragraph_sentence": "<hl> Heresy is any provocative belief or theory that is strongly at variance with established beliefs or customs . <hl> A heretic is a proponent of such claims or beliefs. Heresy is distinct from both apostasy, which is the explicit renunciation of one's religion, principles or cause, and blasphemy, which is an impious utterance or action concerning God or sacred things.",
"paragraph_answer": "Heresy is any provocative belief or theory that is strongly at variance with <hl> established beliefs or customs <hl>. A heretic is a proponent of such claims or beliefs. Heresy is distinct from both apostasy, which is the explicit renunciation of one's religion, principles or cause, and blasphemy, which is an impious utterance or action concerning God or sacred things.",
"sentence_answer": "Heresy is any provocative belief or theory that is strongly at variance with <hl> established beliefs or customs <hl> ."
}
```
The data fields are the same among all splits.
- `question`: a `string` feature.
- `paragraph`: a `string` feature.
- `answer`: a `string` feature.
- `sentence`: a `string` feature.
- `paragraph_answer`: a `string` feature, which is same as the paragraph but the answer is highlighted by a special token `<hl>`.
- `paragraph_sentence`: a `string` feature, which is same as the paragraph but a sentence containing the answer is highlighted by a special token `<hl>`.
- `sentence_answer`: a `string` feature, which is same as the sentence but the answer is highlighted by a special token `<hl>`.
Each of `paragraph_answer`, `paragraph_sentence`, and `sentence_answer` feature is assumed to be used to train a question generation model,
but with different information. The `paragraph_answer` and `sentence_answer` features are for answer-aware question generation and
`paragraph_sentence` feature is for sentence-aware question generation.
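The highlighted features can be derived from a paragraph and its answer span. A minimal sketch, assuming the answer occurs exactly once in the paragraph (the function name is illustrative, not part of the dataset tooling):

```python
def make_paragraph_answer(paragraph: str, answer: str) -> str:
    """Wrap the answer span in <hl> tokens, as in the paragraph_answer feature."""
    start = paragraph.index(answer)  # assumes a single occurrence
    end = start + len(answer)
    return paragraph[:start] + "<hl> " + answer + " <hl>" + paragraph[end:]

paragraph = ("Heresy is any provocative belief or theory that is strongly "
             "at variance with established beliefs or customs.")
highlighted = make_paragraph_answer(paragraph, "established beliefs or customs")
print(highlighted)
```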
## Data Splits
|train|validation|test |
|----:|---------:|----:|
|75722| 10570|11877|
## Citation Information
```
@inproceedings{ushio-etal-2022-generative,
title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
author = "Ushio, Asahi and
Alva-Manchego, Fernando and
Camacho-Collados, Jose",
booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2022",
address = "Abu Dhabi, U.A.E.",
publisher = "Association for Computational Linguistics",
}
``` |
open-llm-leaderboard/details_NousResearch__Nous-Hermes-2-SOLAR-10.7B | ---
pretty_name: Evaluation run of NousResearch/Nous-Hermes-2-SOLAR-10.7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NousResearch/Nous-Hermes-2-SOLAR-10.7B](https://huggingface.co/NousResearch/Nous-Hermes-2-SOLAR-10.7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NousResearch__Nous-Hermes-2-SOLAR-10.7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-04T13:44:46.879799](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__Nous-Hermes-2-SOLAR-10.7B/blob/main/results_2024-01-04T13-44-46.879799.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6655569621891528,\n\
\ \"acc_stderr\": 0.0315088972531857,\n \"acc_norm\": 0.6661789008110104,\n\
\ \"acc_norm_stderr\": 0.03215679079738553,\n \"mc1\": 0.3929008567931457,\n\
\ \"mc1_stderr\": 0.017097248285233065,\n \"mc2\": 0.5582372806316004,\n\
\ \"mc2_stderr\": 0.015330920960330282\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6305460750853242,\n \"acc_stderr\": 0.014104578366491892,\n\
\ \"acc_norm\": 0.6672354948805461,\n \"acc_norm_stderr\": 0.013769863046192304\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6576379207329217,\n\
\ \"acc_stderr\": 0.004735302937476554,\n \"acc_norm\": 0.8489344752041426,\n\
\ \"acc_norm_stderr\": 0.0035738085511685365\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7631578947368421,\n \"acc_stderr\": 0.03459777606810535,\n\
\ \"acc_norm\": 0.7631578947368421,\n \"acc_norm_stderr\": 0.03459777606810535\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.02845015479411864,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.02845015479411864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566018,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566018\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n\
\ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.543859649122807,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.543859649122807,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4894179894179894,\n \"acc_stderr\": 0.025745542276045478,\n \"\
acc_norm\": 0.4894179894179894,\n \"acc_norm_stderr\": 0.025745542276045478\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8064516129032258,\n\
\ \"acc_stderr\": 0.022475258525536057,\n \"acc_norm\": 0.8064516129032258,\n\
\ \"acc_norm_stderr\": 0.022475258525536057\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.03515895551165698,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.03515895551165698\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8303030303030303,\n \"acc_stderr\": 0.02931118867498311,\n\
\ \"acc_norm\": 0.8303030303030303,\n \"acc_norm_stderr\": 0.02931118867498311\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8787878787878788,\n \"acc_stderr\": 0.023253157951942067,\n \"\
acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.023253157951942067\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.362962962962963,\n \"acc_stderr\": 0.02931820364520686,\n \
\ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.02931820364520686\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188704,\n \
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188704\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660836,\n \"\
acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660836\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n\
\ \"acc_stderr\": 0.025845017986926913,\n \"acc_norm\": 0.8382352941176471,\n\
\ \"acc_norm_stderr\": 0.025845017986926913\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.8776371308016878,\n \"acc_stderr\": 0.021331741829746786,\n\
\ \"acc_norm\": 0.8776371308016878,\n \"acc_norm_stderr\": 0.021331741829746786\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7488789237668162,\n\
\ \"acc_stderr\": 0.029105220833224622,\n \"acc_norm\": 0.7488789237668162,\n\
\ \"acc_norm_stderr\": 0.029105220833224622\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489122,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489122\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371812,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371812\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3486033519553073,\n\
\ \"acc_stderr\": 0.01593748465668703,\n \"acc_norm\": 0.3486033519553073,\n\
\ \"acc_norm_stderr\": 0.01593748465668703\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02380518652488813,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02380518652488813\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7746913580246914,\n \"acc_stderr\": 0.02324620264781975,\n\
\ \"acc_norm\": 0.7746913580246914,\n \"acc_norm_stderr\": 0.02324620264781975\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5177304964539007,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.5177304964539007,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.500651890482399,\n\
\ \"acc_stderr\": 0.012770225252255563,\n \"acc_norm\": 0.500651890482399,\n\
\ \"acc_norm_stderr\": 0.012770225252255563\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7573529411764706,\n \"acc_stderr\": 0.026040662474201257,\n\
\ \"acc_norm\": 0.7573529411764706,\n \"acc_norm_stderr\": 0.026040662474201257\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.684640522875817,\n \"acc_stderr\": 0.018798086284886887,\n \
\ \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.018798086284886887\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7877551020408163,\n \"acc_stderr\": 0.026176967197866764,\n\
\ \"acc_norm\": 0.7877551020408163,\n \"acc_norm_stderr\": 0.026176967197866764\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466108,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466108\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3929008567931457,\n\
\ \"mc1_stderr\": 0.017097248285233065,\n \"mc2\": 0.5582372806316004,\n\
\ \"mc2_stderr\": 0.015330920960330282\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8279400157853196,\n \"acc_stderr\": 0.010607731615246996\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6944655041698257,\n \
\ \"acc_stderr\": 0.01268813407672688\n }\n}\n```"
repo_url: https://huggingface.co/NousResearch/Nous-Hermes-2-SOLAR-10.7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|arc:challenge|25_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|gsm8k|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hellaswag|10_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T13-44-46.879799.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T13-44-46.879799.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- '**/details_harness|winogrande|5_2024-01-04T13-44-46.879799.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-04T13-44-46.879799.parquet'
- config_name: results
data_files:
- split: 2024_01_04T13_44_46.879799
path:
- results_2024-01-04T13-44-46.879799.parquet
- split: latest
path:
- results_2024-01-04T13-44-46.879799.parquet
---
# Dataset Card for Evaluation run of NousResearch/Nous-Hermes-2-SOLAR-10.7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NousResearch/Nous-Hermes-2-SOLAR-10.7B](https://huggingface.co/NousResearch/Nous-Hermes-2-SOLAR-10.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NousResearch__Nous-Hermes-2-SOLAR-10.7B",
"harness_winogrande_5",
split="train")
```
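The config names listed in the front matter above follow a simple convention: the harness task label (e.g. `hendrycksTest-professional_law` or `truthfulqa:mc`) with `:` and `-` replaced by underscores, prefixed with `harness_` and suffixed with the few-shot count. A small helper sketching that mapping (an illustrative function, not part of the `datasets` API):

```python
def task_to_config(task: str, n_shot: int) -> str:
    """Map a harness task label to its leaderboard config name.

    Illustrative sketch of the naming convention visible in the
    front matter above, e.g. 'truthfulqa:mc' at 0 shots maps to
    the config 'harness_truthfulqa_mc_0'.
    """
    sanitized = task.replace(":", "_").replace("-", "_")
    return f"harness_{sanitized}_{n_shot}"

print(task_to_config("hendrycksTest-professional_law", 5))
# harness_hendrycksTest_professional_law_5
print(task_to_config("arc:challenge", 25))
# harness_arc_challenge_25
```

Any of the names produced this way can be passed as the second argument to `load_dataset` above.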
## Latest results
These are the [latest results from run 2024-01-04T13:44:46.879799](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__Nous-Hermes-2-SOLAR-10.7B/blob/main/results_2024-01-04T13-44-46.879799.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6655569621891528,
"acc_stderr": 0.0315088972531857,
"acc_norm": 0.6661789008110104,
"acc_norm_stderr": 0.03215679079738553,
"mc1": 0.3929008567931457,
"mc1_stderr": 0.017097248285233065,
"mc2": 0.5582372806316004,
"mc2_stderr": 0.015330920960330282
},
"harness|arc:challenge|25": {
"acc": 0.6305460750853242,
"acc_stderr": 0.014104578366491892,
"acc_norm": 0.6672354948805461,
"acc_norm_stderr": 0.013769863046192304
},
"harness|hellaswag|10": {
"acc": 0.6576379207329217,
"acc_stderr": 0.004735302937476554,
"acc_norm": 0.8489344752041426,
"acc_norm_stderr": 0.0035738085511685365
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7631578947368421,
"acc_stderr": 0.03459777606810535,
"acc_norm": 0.7631578947368421,
"acc_norm_stderr": 0.03459777606810535
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.02845015479411864,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.02845015479411864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566018,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566018
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.03656343653353159,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.03656343653353159
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.543859649122807,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.543859649122807,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4894179894179894,
"acc_stderr": 0.025745542276045478,
"acc_norm": 0.4894179894179894,
"acc_norm_stderr": 0.025745542276045478
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8064516129032258,
"acc_stderr": 0.022475258525536057,
"acc_norm": 0.8064516129032258,
"acc_norm_stderr": 0.022475258525536057
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.03515895551165698,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.03515895551165698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8303030303030303,
"acc_stderr": 0.02931118867498311,
"acc_norm": 0.8303030303030303,
"acc_norm_stderr": 0.02931118867498311
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.023253157951942067,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.023253157951942067
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.02931820364520686,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.02931820364520686
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.02995382389188704,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.02995382389188704
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660836,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660836
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926913,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926913
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8776371308016878,
"acc_stderr": 0.021331741829746786,
"acc_norm": 0.8776371308016878,
"acc_norm_stderr": 0.021331741829746786
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7488789237668162,
"acc_stderr": 0.029105220833224622,
"acc_norm": 0.7488789237668162,
"acc_norm_stderr": 0.029105220833224622
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489122,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489122
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371812,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371812
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3486033519553073,
"acc_stderr": 0.01593748465668703,
"acc_norm": 0.3486033519553073,
"acc_norm_stderr": 0.01593748465668703
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02380518652488813,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02380518652488813
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7746913580246914,
"acc_stderr": 0.02324620264781975,
"acc_norm": 0.7746913580246914,
"acc_norm_stderr": 0.02324620264781975
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5177304964539007,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.5177304964539007,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.500651890482399,
"acc_stderr": 0.012770225252255563,
"acc_norm": 0.500651890482399,
"acc_norm_stderr": 0.012770225252255563
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7573529411764706,
"acc_stderr": 0.026040662474201257,
"acc_norm": 0.7573529411764706,
"acc_norm_stderr": 0.026040662474201257
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.684640522875817,
"acc_stderr": 0.018798086284886887,
"acc_norm": 0.684640522875817,
"acc_norm_stderr": 0.018798086284886887
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7877551020408163,
"acc_stderr": 0.026176967197866764,
"acc_norm": 0.7877551020408163,
"acc_norm_stderr": 0.026176967197866764
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466108,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466108
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3929008567931457,
"mc1_stderr": 0.017097248285233065,
"mc2": 0.5582372806316004,
"mc2_stderr": 0.015330920960330282
},
"harness|winogrande|5": {
"acc": 0.8279400157853196,
"acc_stderr": 0.010607731615246996
},
"harness|gsm8k|5": {
"acc": 0.6944655041698257,
"acc_stderr": 0.01268813407672688
}
}
```
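As a local sketch of how the per-task entries above can be aggregated into a single MMLU score (using a small excerpt of the dictionary; keys follow the `harness|<task>|<n_shot>` pattern, and the full file has 57 `hendrycksTest` subtasks), assuming a simple unweighted mean:

```python
# Excerpt of the results dictionary above (values copied verbatim).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.35},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.562962962962963},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7631578947368421},
    "harness|winogrande|5": {"acc": 0.8279400157853196},
}

# Keep only the MMLU (hendrycksTest) subtasks and average their accuracies.
mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_mean = sum(mmlu_accs) / len(mmlu_accs)
print(f"{len(mmlu_accs)} MMLU subtasks, mean acc = {mmlu_mean:.4f}")
```

The leaderboard computes its aggregated metrics server-side and stores them in the `results` configuration; the snippet above only illustrates the same kind of averaging on the raw dictionary.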
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
pretty_name: Evaluation run of macadeliccc/polyglot-math-4x7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [macadeliccc/polyglot-math-4x7b](https://huggingface.co/macadeliccc/polyglot-math-4x7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_macadeliccc__polyglot-math-4x7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-14T01:25:55.830403](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__polyglot-math-4x7b/blob/main/results_2024-01-14T01-25-55.830403.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6367747877161951,\n\
\ \"acc_stderr\": 0.03232816338890694,\n \"acc_norm\": 0.6393383626953215,\n\
\ \"acc_norm_stderr\": 0.03297276004070419,\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.016838862883965827,\n \"mc2\": 0.5378477391082209,\n\
\ \"mc2_stderr\": 0.015247687104643274\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6023890784982935,\n \"acc_stderr\": 0.014301752223279542,\n\
\ \"acc_norm\": 0.6373720136518771,\n \"acc_norm_stderr\": 0.014049106564955009\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6549492133041227,\n\
\ \"acc_stderr\": 0.004744132825391518,\n \"acc_norm\": 0.8485361481776539,\n\
\ \"acc_norm_stderr\": 0.0035776774950640874\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998905,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998905\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724053,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724053\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.03745554791462456,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.03745554791462456\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851105,\n \"\
acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851105\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"\
acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.024321738484602354,\n\
\ \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.024321738484602354\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.02911661760608301,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02911661760608301\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612896,\n \"\
acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612896\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078966,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078966\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.02730348459906944,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.02730348459906944\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.039578354719809784,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.039578354719809784\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n\
\ \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n\
\ \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508283,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508283\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39664804469273746,\n\
\ \"acc_stderr\": 0.01636135476982247,\n \"acc_norm\": 0.39664804469273746,\n\
\ \"acc_norm_stderr\": 0.01636135476982247\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n\
\ \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.026385273703464492,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.026385273703464492\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4511082138200782,\n\
\ \"acc_stderr\": 0.012709037347346233,\n \"acc_norm\": 0.4511082138200782,\n\
\ \"acc_norm_stderr\": 0.012709037347346233\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.02916312857067073,\n\
\ \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.02916312857067073\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.01909422816700032,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.01909422816700032\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142777,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142777\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n\
\ \"acc_stderr\": 0.027962677604768914,\n \"acc_norm\": 0.8059701492537313,\n\
\ \"acc_norm_stderr\": 0.027962677604768914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.016838862883965827,\n \"mc2\": 0.5378477391082209,\n\
\ \"mc2_stderr\": 0.015247687104643274\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7845303867403315,\n \"acc_stderr\": 0.011555295286059282\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5663381349507203,\n \
\ \"acc_stderr\": 0.013650728047064695\n }\n}\n```"
repo_url: https://huggingface.co/macadeliccc/polyglot-math-4x7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|arc:challenge|25_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|gsm8k|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hellaswag|10_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T01-25-55.830403.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T01-25-55.830403.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- '**/details_harness|winogrande|5_2024-01-14T01-25-55.830403.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-14T01-25-55.830403.parquet'
- config_name: results
data_files:
- split: 2024_01_14T01_25_55.830403
path:
- results_2024-01-14T01-25-55.830403.parquet
- split: latest
path:
- results_2024-01-14T01-25-55.830403.parquet
---
# Dataset Card for Evaluation run of macadeliccc/polyglot-math-4x7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [macadeliccc/polyglot-math-4x7b](https://huggingface.co/macadeliccc/polyglot-math-4x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_macadeliccc__polyglot-math-4x7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-14T01:25:55.830403](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__polyglot-math-4x7b/blob/main/results_2024-01-14T01-25-55.830403.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6367747877161951,
"acc_stderr": 0.03232816338890694,
"acc_norm": 0.6393383626953215,
"acc_norm_stderr": 0.03297276004070419,
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965827,
"mc2": 0.5378477391082209,
"mc2_stderr": 0.015247687104643274
},
"harness|arc:challenge|25": {
"acc": 0.6023890784982935,
"acc_stderr": 0.014301752223279542,
"acc_norm": 0.6373720136518771,
"acc_norm_stderr": 0.014049106564955009
},
"harness|hellaswag|10": {
"acc": 0.6549492133041227,
"acc_stderr": 0.004744132825391518,
"acc_norm": 0.8485361481776539,
"acc_norm_stderr": 0.0035776774950640874
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998905,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998905
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724053,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724053
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03745554791462456,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03745554791462456
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851105,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851105
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.024321738484602354,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.024321738484602354
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.02911661760608301,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.02911661760608301
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612896,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612896
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078966,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078966
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.02730348459906944,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.02730348459906944
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809784,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809784
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508283,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39664804469273746,
"acc_stderr": 0.01636135476982247,
"acc_norm": 0.39664804469273746,
"acc_norm_stderr": 0.01636135476982247
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464492,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464492
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4511082138200782,
"acc_stderr": 0.012709037347346233,
"acc_norm": 0.4511082138200782,
"acc_norm_stderr": 0.012709037347346233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.02916312857067073,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.02916312857067073
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.01909422816700032,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.01909422816700032
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142777,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142777
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768914,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965827,
"mc2": 0.5378477391082209,
"mc2_stderr": 0.015247687104643274
},
"harness|winogrande|5": {
"acc": 0.7845303867403315,
"acc_stderr": 0.011555295286059282
},
"harness|gsm8k|5": {
"acc": 0.5663381349507203,
"acc_stderr": 0.013650728047064695
}
}
```
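The per-task `hendrycksTest-*` scores above are aggregated into a single MMLU number by an unweighted mean over the sub-tasks. A minimal sketch of that aggregation over a hand-copied subset of the results (whether the official column uses `acc` or `acc_norm` is an assumption here; for the tasks below the two values are identical):

```python
# Sketch: macro-average the hendrycksTest sub-task accuracies.
# The dict below is a small hand-copied subset of the results above,
# for illustration only.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.29},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6370370370370371},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7039473684210527},
}

mmlu_tasks = {k: v for k, v in results.items() if "hendrycksTest" in k}
mmlu_score = sum(v["acc"] for v in mmlu_tasks.values()) / len(mmlu_tasks)
print(f"MMLU (subset of {len(mmlu_tasks)} tasks): {mmlu_score:.4f}")
```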
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Kant1/French_Wikipedia_articles | ---
task_categories:
- text-generation
language:
- fr
---
Dump of 2023-08-20 of all French articles in Wikipedia:
https://dumps.wikimedia.org/frwiki/20230820/frwiki-20230820-pages-articles.xml.bz2 |
CATIE-AQ/piaf_fr_prompt_qa | ---
language:
- fr
license: mit
size_categories:
- 100K<n<1M
task_categories:
- question-answering
tags:
- DFP
- french prompts
annotations_creators:
- found
language_creators:
- found
multilinguality:
- monolingual
source_datasets:
- etalab-ia/piaf
---
# piaf_fr_prompt_qa
## Summary
**piaf_fr_prompt_qa** is a subset of the [**Dataset of French Prompts (DFP)**](https://huggingface.co/datasets/CATIE-AQ/DFP).
It contains **387,408** rows that can be used for a question-answering task.
The original data (without prompts) comes from the dataset [PIAF](https://huggingface.co/datasets/etalab-ia/piaf) and was augmented by questions in SQUAD 2.0 format in the [FrenchQA]( https://huggingface.co/datasets/CATIE-AQ/frenchQA) dataset.
A list of prompts (see below) was then applied in order to build the input and target columns and thus obtain the same format as the [xP3](https://huggingface.co/datasets/bigscience/xP3) dataset by Muennighoff et al.
## Prompts used
### List
42 prompts were created for this dataset. The logic applied consists of proposing prompts in the indicative tense, in the informal (*tutoiement*) form of address, and in the formal (*vouvoiement*) form of address.
```
# SQUAD 1.0 format
'Question : "'+question+'"\nContexte : "'+context+'" Réponse :',
'La réponse à la question "'+question+'" se trouve dans "'+context+'" Pouvez-vous me la dire ?',
'La réponse à la question "'+question+'" se trouve dans "'+context+'" Peux-tu me la dire ?',
'Extraire la réponse à la question à partir du contexte suivant.\n Question : "'+question+'" Contexte : "'+context+'"',
'Extrais la réponse à la question à partir du contexte suivant.\n Question : "'+question+'" Contexte : "'+context+'"',
'Extrayez la réponse à la question à partir du contexte suivant.\n Question : "'+question+'" Contexte : "'+context+'"',
'Étant donné le passage suivant : "'+context+'"\n Répondre à la question suivante sachant que la réponse est présente dans le texte.\n Question : "'+question+'"',
'Étant donné le passage suivant : "'+context+'"\n Réponds à la question suivante sachant que la réponse est présente dans le texte.\n Question : "'+question+'"',
'Étant donné le passage suivant : "'+context+'"\n Répondez à la question suivante sachant que la réponse est présente dans le texte.\n Question : "'+question+'"',
"""La réponse à la question : " """+question+""" " se trouve dans le texte : " """+context+""" "\n Peux-tu l'indiquer ?""",
"""La réponse à la question : " """+question+""" " se trouve dans le texte : " """+context+""" "\n Pouvez-vous l'indiquer ?""",
"""La réponse à la question : " """+question+""" " se trouve dans le texte : " """+context+""" "\n Qu'elle est-elle ?""",
# SQUAD 2.0 format
'"'+question+'"\n Répondre à la question ci-dessus en se basant sur le contexte suivant : "'+context+'"\n Si vous ne trouvez pas la réponse, répondre "sans réponse".',
'"'+question+'"\n Réponds à la question ci-dessus en te basant sur le contexte suivant : "'+context+'"\n Si tu ne trouves pas la réponse, répondre "sans réponse".',
'"'+question+'"\n Répondez à la question ci-dessus en vous basant sur le contexte suivant : "'+context+'"\n Si vous ne trouvez pas la réponse, répondre "sans réponse".',
'Utiliser le texte suivant pour répondre à la question : '+question+ '\n\n "'+context+'"\n Si vous ne trouvez pas la réponse, répondre "sans réponse".',
'Utilise le texte suivant pour répondre à la question : '+question+ '\n\n "'+context+'"\n Si tu ne trouves pas la réponse, répondre "sans réponse".',
'Utilisez le texte suivant pour répondre à la question : '+question+ '\n\n "'+context+'"\n Si vous ne trouvez pas la réponse, répondre "sans réponse".',
'Lire le texte suivant et extraire la réponse à la question : "'+question+'"\n\n "'+context+'"\n Si vous ne trouvez pas la réponse, répondre "sans réponse".',
'Lis le texte suivant et extrais la réponse à la question : "'+question+'"\n\n "'+context+'"\n Si tu ne trouves pas la réponse, répondre "sans réponse".',
'Lisez le texte suivant et extrayez la réponse à la question : "'+question+'"\n\n "'+context+'"\n Si vous ne trouvez pas la réponse, répondre "sans réponse".',
'"'+context+'"\n\nSur la base du texte ci-dessus, répondre correctement à la question suivante : \n\n "'+question+'"\n Si vous ne trouvez pas la réponse, répondre "sans réponse".',
'"'+context+'"\n\nSur la base du texte ci-dessus, réponds correctement à la question suivante : \n\n "'+question+'"\n Si tu ne trouves pas la réponse, répondre "sans réponse".',
'"'+context+'"\n\nSur la base du texte ci-dessus, répondez répondre correctement à la question suivante : \n\n "'+question+'"\n Si vous ne trouvez pas la réponse, répondre "sans réponse".',
'Contexte : '+ context +'\n Compte tenu du texte ci-dessus, répondre correctement à la question suivante : "'+question+'"\n Si vous ne trouvez pas la réponse, répondre "sans réponse".',
'Contexte : '+ context +'\n Compte tenu du texte ci-dessus, réponds correctement à la question suivante : "'+question+'"\n Si tu ne trouves pas la réponse, répondre "sans réponse".',
'Contexte : '+ context +'\n Compte tenu du texte ci-dessus, répondez correctement à la question suivante : "'+question+'"\n Si vous ne trouvez pas la réponse, répondre "sans réponse".',
'"'+context+'"\n Extraire du passage la réponse à la question suivante : "'+question+'"\n Si vous ne trouvez pas la réponse, répondre "sans réponse".',
'"'+context+'"\n Extrais du passage la réponse à la question suivante : "'+question+'"\n Si tu ne trouves pas la réponse, répondre "sans réponse".',
'"'+context+'"\n Extrayez du passage la réponse à la question suivante : "'+question+'"\n Si vous ne trouvez pas la réponse, répondre "sans réponse".',
'Compte tenu du passage suivant, répondre à la question qui suit : "'+context+'"\n "'+question+'"\n Si vous ne trouvez pas la réponse, répondre "sans réponse".',
'Compte tenu du passage suivant, réponds à la question qui suit : "'+context+'"\n "'+question+'"\n Si tu ne trouves pas la réponse, répondre "sans réponse".',
'Compte tenu du passage suivant, répondez à la question qui suit : "'+context+'"\n "'+question+'"\n Si vous ne trouvez pas la réponse, répondre "sans réponse".',
'Après avoir lu le paragraphe, répondre à la question suivante : "'+context+'"\n "'+question+'"\n Si vous ne trouvez pas la réponse, répondre "sans réponse".',
'Après avoir lu le paragraphe, réponds à la question suivante : "'+context+'"\n "'+question+'"\n Si tu ne trouves pas la réponse, répondre "sans réponse".',
'Après avoir lu le paragraphe, répondez à la question suivante : "'+context+'"\n "'+question+'"\n Si vous ne trouvez pas la réponse, répondre "sans réponse".',
'Se référer au passage ci-dessous et répondre à la question suivante:\n Passage : "'+context+'"Question : "'+question+'"\n Si vous ne trouvez pas la réponse, répondre "sans réponse".',
'Référe-toi au passage ci-dessous et réponds à la question suivante:\n Passage : "'+context+'"Question : "'+question+'"\n Si tu ne trouves pas la réponse, répondre "sans réponse".',
'Référez-vous au passage ci-dessous et répondez à la question suivante:\n Passage : "'+context+'"Question : "'+question+'"\n Si vous ne trouvez pas la réponse, répondre "sans réponse".',
'Lire le passage suivant et répondez à la question qui suit : \n "'+context+'"\n "'+question+'"\n Si vous ne trouvez pas la réponse, répondre "sans réponse".',
'Lis le passage suivant et répondez à la question qui suit : \n "'+context+'"\n "'+question+'"\n Si tu ne trouves pas la réponse, répondre "sans réponse".',
'Lisez le passage suivant et répondez à la question qui suit : \n "'+context+'"\n "'+question+'"\n Si vous ne trouvez pas la réponse, répondre "sans réponse".',
```
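Each row of the dataset is built by substituting a `(question, context)` pair into one of the templates above. A minimal sketch of that construction with the first SQuAD 1.0 template (the example strings, and the `inputs`/`targets` names echoing the xP3 convention, are assumptions here):

```python
# Sketch: apply the first prompt template above to a toy QA example.
question = "Quelle est la capitale de la France ?"
context = "Paris est la capitale de la France."
answer = "Paris"

inputs = 'Question : "' + question + '"\nContexte : "' + context + '" Réponse :'
targets = answer
print(inputs)
print(targets)
```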
# Splits
- `train` with 387,408 samples
- no `valid` split
- no `test` split
# How to use?
```
from datasets import load_dataset
dataset = load_dataset("CATIE-AQ/piaf_fr_prompt_qa")
```
# Citation
## Original data
> @InProceedings{keraron-EtAl:2020:LREC,
author = {Keraron, Rachel and Lancrenon, Guillaume and Bras, Mathilde and Allary, Frédéric and Moyse, Gilles and Scialom, Thomas and Soriano-Morales, Edmundo-Pavel and Staiano, Jacopo},
title = {Project PIAF: Building a Native French Question-Answering Dataset},
booktitle = {Proceedings of The 12th Language Resources and Evaluation Conference},
month = {May},
year = {2020},
address = {Marseille, France},
publisher = {European Language Resources Association},
pages = {5483--5492},
url = {https://www.aclweb.org/anthology/2020.lrec-1.673}
}
## This Dataset
> @misc {centre_aquitain_des_technologies_de_l'information_et_electroniques_2023,
author = { {Centre Aquitain des Technologies de l'Information et Electroniques} },
title = { DFP (Revision 1d24c09) },
year = 2023,
url = { https://huggingface.co/datasets/CATIE-AQ/DFP },
doi = { 10.57967/hf/1200 },
publisher = { Hugging Face }
}
## License
MIT |
sokusha/aicg | ---
viewer: false
---
# aicg
thank you [tmpupload](https://huggingface.co/datasets/tmpupload/aicg) for files up to 2024-01-03-0100 <3 |
qgallouedec/prj_gia_dataset_metaworld_plate_slide_back_v2_1111 | ---
library_name: gia
tags:
- deep-reinforcement-learning
- reinforcement-learning
- gia
- multi-task
- multi-modal
- imitation-learning
- offline-reinforcement-learning
---
An imitation learning dataset for the plate-slide-back-v2 environment, sampled from the plate-slide-back-v2 policy.
This dataset was created as part of the Generally Intelligent Agents (GIA) project: https://github.com/huggingface/gia
## Load dataset
First, clone it with
```sh
git clone https://huggingface.co/datasets/qgallouedec/prj_gia_dataset_metaworld_plate_slide_back_v2_1111
```
Then, load it with
```python
import numpy as np
dataset = np.load("prj_gia_dataset_metaworld_plate_slide_back_v2_1111/dataset.npy", allow_pickle=True).item()
print(dataset.keys()) # dict_keys(['observations', 'actions', 'dones', 'rewards'])
```
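The arrays are flat across timesteps, with `dones` marking episode boundaries. A minimal sketch of recovering per-episode chunks (assuming `dones` is a boolean array aligned with the other keys, which the card does not guarantee):

```python
import numpy as np

def split_episodes(dataset):
    """Split flat, timestep-major arrays into per-episode chunks using `dones`."""
    ends = np.flatnonzero(dataset["dones"]) + 1  # index just past each episode end
    starts = np.concatenate(([0], ends[:-1]))
    return [
        {k: np.asarray(v)[s:e] for k, v in dataset.items()}
        for s, e in zip(starts, ends)
    ]

# Toy 5-step dataset holding two episodes (lengths 3 and 2):
toy = {
    "observations": np.arange(5),
    "actions": np.arange(5),
    "rewards": np.ones(5),
    "dones": np.array([False, False, True, False, True]),
}
episodes = split_episodes(toy)
print(len(episodes))
```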
|
Atila2251/Juh | ---
license: openrail
---
|
WinTaoWang/bioinspired-llama2-1k | ---
dataset_info:
features:
- name: 'Unnamed: 0.1'
dtype: int64
- name: 'Unnamed: 0'
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 1354171
num_examples: 1000
download_size: 611895
dataset_size: 1354171
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-staging-eval-project-xsum-8dc1621c-12925736 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xsum
eval_info:
task: summarization
model: google/pegasus-cnn_dailymail
metrics: ['bleu']
dataset_name: xsum
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: google/pegasus-cnn_dailymail
* Dataset: xsum
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@xarymast](https://huggingface.co/xarymast) for evaluating this model. |