| datasetId | card |
|---|---|
kpriyanshu256/MultiTabQA-multitable_pretraining-train-v2-2500 | ---
dataset_info:
features:
- name: tables
sequence: string
- name: table_names
sequence: string
- name: query
dtype: string
- name: answer
dtype: string
- name: source
dtype: string
- name: target
dtype: string
- name: source_latex
dtype: string
- name: target_latex
dtype: string
- name: source_html
dtype: string
- name: target_html
dtype: string
- name: source_markdown
dtype: string
- name: target_markdown
dtype: string
splits:
- name: train
num_bytes: 16558753323
num_examples: 2500
download_size: 3252844976
dataset_size: 16558753323
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
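As a quick sanity check (not part of the original card), the split statistics in the YAML above can be cross-checked against each other: `num_bytes` and `num_examples` give the average uncompressed example size, and `download_size` versus `dataset_size` gives the implied compression ratio. A minimal sketch:

```python
# Split statistics copied from the card's YAML above.
num_bytes = 16_558_753_323
num_examples = 2_500
download_size = 3_252_844_976

# Average (uncompressed) size per example, in MiB.
avg_mib = num_bytes / num_examples / (1024 ** 2)

# Compression ratio implied by download_size vs. dataset_size.
ratio = num_bytes / download_size

print(f"{avg_mib:.2f} MiB/example, ~{ratio:.1f}x compression")
```

The roughly 6.3 MiB per example is plausible here, since each row stores every table in six serializations (raw, LaTeX, HTML, and Markdown variants).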
liuyanchen1015/MULTI_VALUE_sst2_double_past | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 1049
num_examples: 7
- name: test
num_bytes: 2699
num_examples: 19
- name: train
num_bytes: 52905
num_examples: 427
download_size: 25213
dataset_size: 56653
---
# Dataset Card for "MULTI_VALUE_sst2_double_past"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
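The split metadata in the card above is internally consistent, which can be verified directly (a hedged sketch, using only the numbers stated in the YAML): `dataset_size` should equal the sum of the per-split `num_bytes`.

```python
# Per-split byte counts copied from the card's YAML above.
splits = {"dev": 1049, "test": 2699, "train": 52905}
dataset_size = 56653

# dataset_size should be the sum of the per-split num_bytes.
assert sum(splits.values()) == dataset_size

# Per-split example counts, also from the YAML.
examples = {"dev": 7, "test": 19, "train": 427}
total_examples = sum(examples.values())
print(total_examples)  # → 453
```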
CyberHarem/penelope_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of penelope/ペネロピ/佩内洛珀 (Azur Lane)
This is the dataset of penelope/ペネロピ/佩内洛珀 (Azur Lane), containing 39 images and their tags.
The core tags of this character are `breasts, purple_eyes, hair_over_one_eye, purple_hair, large_breasts, short_hair, braid, bangs, maid_headdress`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 39 | 56.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/penelope_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 39 | 28.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/penelope_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 84 | 58.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/penelope_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 39 | 47.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/penelope_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 84 | 89.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/penelope_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/penelope_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract it into a local directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, looking_at_viewer, solo, white_background, simple_background, blush, collarbone, fur_trim, upper_body, medium_hair, off_shoulder, closed_mouth |
| 1 | 5 |  |  |  |  |  | 1girl, bare_shoulders, hair_flower, looking_at_viewer, official_alternate_costume, red_flower, solo, cleavage, red_dress, nightgown, wide_sleeves, barefoot, carpet, chinese_clothes, full_body, indoors, paper_fan, simple_background, white_background, yokozuwari |
| 2 | 11 |  |  |  |  |  | 1girl, bare_shoulders, solo, cleavage, detached_sleeves, looking_at_viewer, apron, maid, official_alternate_costume, black_dress, detached_collar, single_thighhigh, white_gloves, white_thighhighs, long_sleeves, medium_hair, black_footwear, indoors, pink_eyes, pink_hair |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | cleavage | looking_at_viewer | solo | white_background | simple_background | blush | collarbone | fur_trim | upper_body | medium_hair | off_shoulder | closed_mouth | hair_flower | official_alternate_costume | red_flower | red_dress | nightgown | wide_sleeves | barefoot | carpet | chinese_clothes | full_body | indoors | paper_fan | yokozuwari | detached_sleeves | apron | maid | black_dress | detached_collar | single_thighhigh | white_gloves | white_thighhighs | long_sleeves | black_footwear | pink_eyes | pink_hair |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-----------|:--------------------|:-------|:-------------------|:--------------------|:--------|:-------------|:-----------|:-------------|:--------------|:---------------|:---------------|:--------------|:-----------------------------|:-------------|:------------|:------------|:---------------|:-----------|:---------|:------------------|:------------|:----------|:------------|:-------------|:-------------------|:--------|:-------|:--------------|:------------------|:-------------------|:---------------|:-------------------|:---------------|:-----------------|:------------|:------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 2 | 11 |  |  |  |  |  | X | X | X | X | X | | | | | | | X | | | | X | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_Sao10K__Venomia-1.1-m7 | ---
pretty_name: Evaluation run of Sao10K/Venomia-1.1-m7
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Venomia-1.1-m7](https://huggingface.co/Sao10K/Venomia-1.1-m7) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Venomia-1.1-m7\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-12T06:19:54.268543](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Venomia-1.1-m7/blob/main/results_2023-12-12T06-19-54.268543.json) (note
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5646939992249473,\n\
\ \"acc_stderr\": 0.03365626984219371,\n \"acc_norm\": 0.5687314107018308,\n\
\ \"acc_norm_stderr\": 0.03434566057163248,\n \"mc1\": 0.31946144430844553,\n\
\ \"mc1_stderr\": 0.0163226441829605,\n \"mc2\": 0.47211906040617,\n\
\ \"mc2_stderr\": 0.015511830880546584\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5563139931740614,\n \"acc_stderr\": 0.014518421825670442,\n\
\ \"acc_norm\": 0.5844709897610921,\n \"acc_norm_stderr\": 0.014401366641216384\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6445927106154152,\n\
\ \"acc_stderr\": 0.004776583530909567,\n \"acc_norm\": 0.8304122684724159,\n\
\ \"acc_norm_stderr\": 0.0037450326672282797\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5547169811320755,\n \"acc_stderr\": 0.03058805297427065,\n\
\ \"acc_norm\": 0.5547169811320755,\n \"acc_norm_stderr\": 0.03058805297427065\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n\
\ \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.6041666666666666,\n\
\ \"acc_norm_stderr\": 0.04089465449325582\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.5549132947976878,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077615,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077615\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4851063829787234,\n \"acc_stderr\": 0.032671518489247764,\n\
\ \"acc_norm\": 0.4851063829787234,\n \"acc_norm_stderr\": 0.032671518489247764\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.02494236893115979,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.02494236893115979\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.042163702135578345,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.042163702135578345\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6548387096774193,\n \"acc_stderr\": 0.027045746573534327,\n \"\
acc_norm\": 0.6548387096774193,\n \"acc_norm_stderr\": 0.027045746573534327\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n \"\
acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6767676767676768,\n \"acc_stderr\": 0.033322999210706444,\n \"\
acc_norm\": 0.6767676767676768,\n \"acc_norm_stderr\": 0.033322999210706444\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.772020725388601,\n \"acc_stderr\": 0.03027690994517826,\n\
\ \"acc_norm\": 0.772020725388601,\n \"acc_norm_stderr\": 0.03027690994517826\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4897435897435897,\n \"acc_stderr\": 0.025345672221942374,\n\
\ \"acc_norm\": 0.4897435897435897,\n \"acc_norm_stderr\": 0.025345672221942374\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.02784081149587192,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02784081149587192\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5168067226890757,\n \"acc_stderr\": 0.03246013680375308,\n \
\ \"acc_norm\": 0.5168067226890757,\n \"acc_norm_stderr\": 0.03246013680375308\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7155963302752294,\n \"acc_stderr\": 0.019342036587702574,\n \"\
acc_norm\": 0.7155963302752294,\n \"acc_norm_stderr\": 0.019342036587702574\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.32407407407407407,\n \"acc_stderr\": 0.03191923445686185,\n \"\
acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.03191923445686185\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7009803921568627,\n \"acc_stderr\": 0.03213325717373617,\n \"\
acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.03213325717373617\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7130801687763713,\n \"acc_stderr\": 0.029443773022594693,\n \
\ \"acc_norm\": 0.7130801687763713,\n \"acc_norm_stderr\": 0.029443773022594693\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467765,\n\
\ \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467765\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"\
acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04643454608906276,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04643454608906276\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.046695106638751906,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.046695106638751906\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n\
\ \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.756066411238825,\n\
\ \"acc_stderr\": 0.015357212665829463,\n \"acc_norm\": 0.756066411238825,\n\
\ \"acc_norm_stderr\": 0.015357212665829463\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.026226158605124655,\n\
\ \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.026226158605124655\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28268156424581004,\n\
\ \"acc_stderr\": 0.015060381730018115,\n \"acc_norm\": 0.28268156424581004,\n\
\ \"acc_norm_stderr\": 0.015060381730018115\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6437908496732027,\n \"acc_stderr\": 0.027420477662629235,\n\
\ \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.027420477662629235\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6077170418006431,\n\
\ \"acc_stderr\": 0.02773125864701199,\n \"acc_norm\": 0.6077170418006431,\n\
\ \"acc_norm_stderr\": 0.02773125864701199\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.026571483480719974,\n\
\ \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.026571483480719974\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4148936170212766,\n \"acc_stderr\": 0.0293922365846125,\n \
\ \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.0293922365846125\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4028683181225554,\n\
\ \"acc_stderr\": 0.012526955577118014,\n \"acc_norm\": 0.4028683181225554,\n\
\ \"acc_norm_stderr\": 0.012526955577118014\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.48161764705882354,\n \"acc_stderr\": 0.03035230339535196,\n\
\ \"acc_norm\": 0.48161764705882354,\n \"acc_norm_stderr\": 0.03035230339535196\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5898692810457516,\n \"acc_stderr\": 0.019898412717635903,\n \
\ \"acc_norm\": 0.5898692810457516,\n \"acc_norm_stderr\": 0.019898412717635903\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n\
\ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623326,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623326\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.0389136449583582,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.0389136449583582\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31946144430844553,\n\
\ \"mc1_stderr\": 0.0163226441829605,\n \"mc2\": 0.47211906040617,\n\
\ \"mc2_stderr\": 0.015511830880546584\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.744277821625888,\n \"acc_stderr\": 0.012261253845440473\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.36087945413191813,\n \
\ \"acc_stderr\": 0.013228626753925136\n }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Venomia-1.1-m7
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|arc:challenge|25_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|gsm8k|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hellaswag|10_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-12T06-19-54.268543.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-12T06-19-54.268543.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- '**/details_harness|winogrande|5_2023-12-12T06-19-54.268543.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-12T06-19-54.268543.parquet'
- config_name: results
data_files:
- split: 2023_12_12T06_19_54.268543
path:
- results_2023-12-12T06-19-54.268543.parquet
- split: latest
path:
- results_2023-12-12T06-19-54.268543.parquet
---
# Dataset Card for Evaluation run of Sao10K/Venomia-1.1-m7
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Sao10K/Venomia-1.1-m7](https://huggingface.co/Sao10K/Venomia-1.1-m7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Venomia-1.1-m7",
"harness_winogrande_5",
	split="latest")
```
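Each per-task config name in the YAML above is derived from the corresponding harness task name by normalizing the separator characters (`|`, `-`, and `:` become `_`). A minimal sketch of that mapping, which can be handy for loading a specific task's details; the helper name `task_to_config` is illustrative, not part of any library:

```python
def task_to_config(task_name: str) -> str:
    # Config names are the harness task names with "|", "-" and ":"
    # normalized to "_", e.g.
    # "harness|hendrycksTest-abstract_algebra|5" -> "harness_hendrycksTest_abstract_algebra_5"
    for ch in "|-:":
        task_name = task_name.replace(ch, "_")
    return task_name

print(task_to_config("harness|hendrycksTest-abstract_algebra|5"))
# harness_hendrycksTest_abstract_algebra_5
print(task_to_config("harness|truthfulqa:mc|0"))
# harness_truthfulqa_mc_0
```

The resulting string can be passed as the second argument to `load_dataset` in the snippet above.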
## Latest results
These are the [latest results from run 2023-12-12T06:19:54.268543](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Venomia-1.1-m7/blob/main/results_2023-12-12T06-19-54.268543.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5646939992249473,
"acc_stderr": 0.03365626984219371,
"acc_norm": 0.5687314107018308,
"acc_norm_stderr": 0.03434566057163248,
"mc1": 0.31946144430844553,
"mc1_stderr": 0.0163226441829605,
"mc2": 0.47211906040617,
"mc2_stderr": 0.015511830880546584
},
"harness|arc:challenge|25": {
"acc": 0.5563139931740614,
"acc_stderr": 0.014518421825670442,
"acc_norm": 0.5844709897610921,
"acc_norm_stderr": 0.014401366641216384
},
"harness|hellaswag|10": {
"acc": 0.6445927106154152,
"acc_stderr": 0.004776583530909567,
"acc_norm": 0.8304122684724159,
"acc_norm_stderr": 0.0037450326672282797
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5547169811320755,
"acc_stderr": 0.03058805297427065,
"acc_norm": 0.5547169811320755,
"acc_norm_stderr": 0.03058805297427065
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6041666666666666,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.6041666666666666,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077615,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077615
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4851063829787234,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.4851063829787234,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.046306532033665956,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.046306532033665956
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.02494236893115979,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.02494236893115979
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.042163702135578345,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.042163702135578345
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6548387096774193,
"acc_stderr": 0.027045746573534327,
"acc_norm": 0.6548387096774193,
"acc_norm_stderr": 0.027045746573534327
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6767676767676768,
"acc_stderr": 0.033322999210706444,
"acc_norm": 0.6767676767676768,
"acc_norm_stderr": 0.033322999210706444
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.772020725388601,
"acc_stderr": 0.03027690994517826,
"acc_norm": 0.772020725388601,
"acc_norm_stderr": 0.03027690994517826
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4897435897435897,
"acc_stderr": 0.025345672221942374,
"acc_norm": 0.4897435897435897,
"acc_norm_stderr": 0.025345672221942374
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.02784081149587192,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.02784081149587192
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5168067226890757,
"acc_stderr": 0.03246013680375308,
"acc_norm": 0.5168067226890757,
"acc_norm_stderr": 0.03246013680375308
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7155963302752294,
"acc_stderr": 0.019342036587702574,
"acc_norm": 0.7155963302752294,
"acc_norm_stderr": 0.019342036587702574
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.03191923445686185,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.03191923445686185
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7009803921568627,
"acc_stderr": 0.03213325717373617,
"acc_norm": 0.7009803921568627,
"acc_norm_stderr": 0.03213325717373617
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7130801687763713,
"acc_stderr": 0.029443773022594693,
"acc_norm": 0.7130801687763713,
"acc_norm_stderr": 0.029443773022594693
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906276,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906276
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.046695106638751906,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.046695106638751906
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.756066411238825,
"acc_stderr": 0.015357212665829463,
"acc_norm": 0.756066411238825,
"acc_norm_stderr": 0.015357212665829463
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.026226158605124655,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.026226158605124655
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28268156424581004,
"acc_stderr": 0.015060381730018115,
"acc_norm": 0.28268156424581004,
"acc_norm_stderr": 0.015060381730018115
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6437908496732027,
"acc_stderr": 0.027420477662629235,
"acc_norm": 0.6437908496732027,
"acc_norm_stderr": 0.027420477662629235
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6077170418006431,
"acc_stderr": 0.02773125864701199,
"acc_norm": 0.6077170418006431,
"acc_norm_stderr": 0.02773125864701199
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.026571483480719974,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.026571483480719974
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.0293922365846125,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.0293922365846125
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4028683181225554,
"acc_stderr": 0.012526955577118014,
"acc_norm": 0.4028683181225554,
"acc_norm_stderr": 0.012526955577118014
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.48161764705882354,
"acc_stderr": 0.03035230339535196,
"acc_norm": 0.48161764705882354,
"acc_norm_stderr": 0.03035230339535196
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5898692810457516,
"acc_stderr": 0.019898412717635903,
"acc_norm": 0.5898692810457516,
"acc_norm_stderr": 0.019898412717635903
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.0293936093198798,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.0293936093198798
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623326,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623326
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.0389136449583582,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.0389136449583582
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31946144430844553,
"mc1_stderr": 0.0163226441829605,
"mc2": 0.47211906040617,
"mc2_stderr": 0.015511830880546584
},
"harness|winogrande|5": {
"acc": 0.744277821625888,
"acc_stderr": 0.012261253845440473
},
"harness|gsm8k|5": {
"acc": 0.36087945413191813,
"acc_stderr": 0.013228626753925136
}
}
```
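Each `harness|…` entry above shares the same shape, so aggregate metrics can be recomputed directly from this JSON. A small sketch (the helper name is ours; the sample values are copied from the entries above):

```python
def mean_acc(results: dict) -> float:
    """Average the `acc` field over every task entry that reports one."""
    accs = [entry["acc"] for entry in results.values() if "acc" in entry]
    return sum(accs) / len(accs)


# Two of the per-task entries shown above
sample = {
    "harness|winogrande|5": {"acc": 0.744277821625888,
                             "acc_stderr": 0.012261253845440473},
    "harness|gsm8k|5": {"acc": 0.36087945413191813,
                        "acc_stderr": 0.013228626753925136},
}
```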
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ssbuild/alpaca_gpt4 | ---
license: apache-2.0
---
|
autoevaluate/autoeval-eval-cnn_dailymail-3.0.0-bf74a8-1524255094 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: SamuelAllen123/t5-efficient-large-nl36_fine_tune_sum_V2
metrics: []
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: test
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: SamuelAllen123/t5-efficient-large-nl36_fine_tune_sum_V2
* Dataset: cnn_dailymail
* Config: 3.0.0
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@samuelallen123](https://huggingface.co/samuelallen123) for evaluating this model. |
ericyu/LEVIRCD_Cropped256 | ---
dataset_info:
features:
- name: imageA
dtype: image
- name: imageB
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 287118335.44
num_examples: 7120
- name: test
num_bytes: 73188109.824
num_examples: 2048
- name: val
num_bytes: 34384403.584
num_examples: 1024
download_size: 345121409
dataset_size: 394690848.848
---
# Dataset Card for "LEVIRCD_Cropped256"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DialogueCharacter/chinese_instinwild_unfiltered | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: response
sequence: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 30197794
num_examples: 51504
download_size: 17704859
dataset_size: 30197794
---
# Dataset Card for "chinese_instinwild_unfiltered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nbalepur/KARL | ---
dataset_info:
features:
- name: user_id
dtype: string
- name: card_id
dtype: string
- name: card_text
dtype: string
- name: is_new_fact
dtype: bool
- name: user_n_study_positive
dtype: int64
- name: user_n_study_negative
dtype: int64
- name: user_n_study_total
dtype: int64
- name: card_n_study_positive
dtype: int64
- name: card_n_study_negative
dtype: int64
- name: card_n_study_total
dtype: int64
- name: usercard_n_study_positive
dtype: int64
- name: usercard_n_study_negative
dtype: int64
- name: usercard_n_study_total
dtype: int64
- name: acc_user
dtype: float64
- name: acc_card
dtype: float64
- name: acc_usercard
dtype: float64
- name: usercard_delta
dtype: int64
- name: usercard_delta_previous
dtype: int64
- name: usercard_previous_study_response
dtype: bool
- name: leitner_box
dtype: int64
- name: sm2_efactor
dtype: float64
- name: sm2_interval
dtype: float64
- name: sm2_repetition
dtype: int64
- name: delta_to_leitner_scheduled_date
dtype: int64
- name: delta_to_sm2_scheduled_date
dtype: int64
- name: repetition_model
dtype: string
- name: elapsed_milliseconds
dtype: int64
- name: correct_on_first_try
dtype: bool
- name: utc_datetime
dtype: timestamp[ns, tz=UTC]
- name: utc_date
dtype: date32
- name: response
dtype: bool
- name: record_id
dtype: string
- name: deck_id
dtype: string
- name: deck_name
dtype: string
- name: n_minutes_spent
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 50145895
num_examples: 123143
download_size: 14348387
dataset_size: 50145895
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
javito2003/hfexample | ---
license: mit
---
|
ACT8113/Mysta_Rias | ---
license: openrail
---
|
3ee/regularization-castle | ---
license: mit
tags:
- stable-diffusion
- regularization-images
- text-to-image
- image-to-image
- dreambooth
- class-instance
- preservation-loss-training
---
# Castle Regularization Images
A collection of regularization/class-instance image datasets of castles, intended for use with Stable Diffusion 1.5 in DreamBooth prior-preservation loss training.
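DreamBooth's prior-preservation training combines a reconstruction loss on the subject's instance images with a weighted loss on class/regularization images like these. A minimal framework-agnostic sketch (the function name and the simplification to scalar losses are ours, not from any particular DreamBooth implementation; a weight of 1.0 is a common default):

```python
def prior_preservation_loss(instance_loss: float, class_loss: float,
                            prior_weight: float = 1.0) -> float:
    """Combine the instance (subject) loss with the class-prior loss.

    instance_loss: denoising loss on the subject's training images
    class_loss:    denoising loss on regularization/class images
    prior_weight:  weighting factor for the prior-preservation term
    """
    return instance_loss + prior_weight * class_loss
```

In a real training loop both terms would be per-batch diffusion denoising losses rather than scalars.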
Heba30018/ChestX_Ray | ---
dataset_info:
features:
- name: image
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 239376
num_examples: 6469
download_size: 77729
dataset_size: 239376
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/tartu_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of tartu/タルテュ/塔尔图 (Azur Lane)
This is the dataset of tartu/タルテュ/塔尔图 (Azur Lane), containing 50 images and their tags.
The core tags of this character are `braid, breasts, pink_hair, short_hair, braided_bangs, streaked_hair, yellow_eyes, multicolored_hair, large_breasts, ahoge, earrings, blue_hair, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 50 | 70.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tartu_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 50 | 39.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tartu_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 133 | 88.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tartu_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 50 | 62.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tartu_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 133 | 128.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tartu_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/tartu_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 24 |  |  |  |  |  | bare_shoulders, 1girl, cross_necklace, cleavage, solo, white_dress, looking_at_viewer, french_flag, choker, white_thighhighs, fingerless_gloves, black_gloves, detached_sleeves, simple_background, white_background, collarbone |
| 1 | 19 |  |  |  |  |  | 1girl, solo, bare_shoulders, blue_bikini, looking_at_viewer, blush, hair_flower, sun_hat, white_headwear, day, outdoors, cleavage, popsicle, beachball, holding, sitting, sky, thigh_strap, ass, collarbone, bracelet, frilled_bikini |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | bare_shoulders | 1girl | cross_necklace | cleavage | solo | white_dress | looking_at_viewer | french_flag | choker | white_thighhighs | fingerless_gloves | black_gloves | detached_sleeves | simple_background | white_background | collarbone | blue_bikini | blush | hair_flower | sun_hat | white_headwear | day | outdoors | popsicle | beachball | holding | sitting | sky | thigh_strap | ass | bracelet | frilled_bikini |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------|:--------|:-----------------|:-----------|:-------|:--------------|:--------------------|:--------------|:---------|:-------------------|:--------------------|:---------------|:-------------------|:--------------------|:-------------------|:-------------|:--------------|:--------|:--------------|:----------|:-----------------|:------|:-----------|:-----------|:------------|:----------|:----------|:------|:--------------|:------|:-----------|:-----------------|
| 0 | 24 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 19 |  |  |  |  |  | X | X | | X | X | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
BangumiBase/imocho | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Recently, My Sister Is Unusual
This is the image base of the bangumi Recently, My Sister Is Unusual. We detected 63 characters and 5226 images in total. The full dataset is available [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may actually contain noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability).
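For example, one cleaning workflow is to extract a character's `dataset.zip` locally and then review or filter the images by hand. A minimal sketch (the helper name and paths are illustrative; it assumes the zip has already been downloaded):

```python
import os
import zipfile


def extract_character(zip_path: str, out_dir: str) -> list:
    """Extract one character's dataset.zip into out_dir and list its files."""
    os.makedirs(out_dir, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(out_dir)
    return sorted(os.listdir(out_dir))
```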
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 300 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 35 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 18 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 17 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 64 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 20 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 13 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 14 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 14 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 16 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 19 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 1031 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 15 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 18 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 14 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 8 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 1237 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 31 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 171 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 32 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 108 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 78 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 28 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 45 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 23 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 11 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 14 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 18 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 13 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 18 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 18 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 25 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 24 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 121 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 25 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 22 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 18 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 247 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 10 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 23 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 21 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 10 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 8 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 10 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 10 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 9 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 11 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 220 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 38 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 504 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| 50 | 22 | [Download](50/dataset.zip) |  |  |  |  |  |  |  |  |
| 51 | 29 | [Download](51/dataset.zip) |  |  |  |  |  |  |  |  |
| 52 | 75 | [Download](52/dataset.zip) |  |  |  |  |  |  |  |  |
| 53 | 9 | [Download](53/dataset.zip) |  |  |  |  |  |  |  |  |
| 54 | 60 | [Download](54/dataset.zip) |  |  |  |  |  |  |  |  |
| 55 | 14 | [Download](55/dataset.zip) |  |  |  |  |  |  |  |  |
| 56 | 8 | [Download](56/dataset.zip) |  |  |  |  |  |  |  |  |
| 57 | 85 | [Download](57/dataset.zip) |  |  |  |  |  |  |  |  |
| 58 | 11 | [Download](58/dataset.zip) |  |  |  |  |  |  |  |  |
| 59 | 7 | [Download](59/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 60 | 9 | [Download](60/dataset.zip) |  |  |  |  |  |  |  |  |
| 61 | 14 | [Download](61/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 66 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
crbpfx/KWX | ---
license: cc-by-4.0
task_categories:
- text-generation
language:
- en
tags:
- keywords
- scholarly articles
pretty_name: 'KWX: scholarly articles with keywords from arXiv'
size_categories:
- 10K<n<100K
--- |
ibranze/araproje_mmlu_tr_conf_gpt2_nearestscore_true | ---
dataset_info:
features:
- name: question
dtype: string
- name: subject
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: validation
num_bytes: 137404.0
num_examples: 250
download_size: 83939
dataset_size: 137404.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_mmlu_tr_conf_gpt2_nearestscore_true"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mespinosami/elephants | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': eles
'1': no_eles
splits:
- name: train
num_bytes: 14514122908.549025
num_examples: 159283
- name: test
num_bytes: 3662606794.077976
num_examples: 39821
download_size: 17959333418
dataset_size: 18176729702.627
---
# Dataset Card for "elephants"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
simar95/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966692
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
arbml/sudanese_dialect_speech | ---
dataset_info:
features:
- name: text
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 1207318008.52
num_examples: 3547
download_size: 1624404468
dataset_size: 1207318008.52
---
# Dataset Card for "sudanese_dialect_speech"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ademax/ocr_en | ---
dataset_info:
features:
- name: text
dtype: string
- name: image
dtype: image
- name: meta
struct:
- name: path
dtype: string
- name: subset
dtype: string
splits:
- name: train
num_bytes: 11297948080.151525
num_examples: 200000
download_size: 14723118289
dataset_size: 11297948080.151525
---
# Dataset Card for "ocr_en"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zobnec/controlnet_fs_fetch | ---
dataset_info:
features:
- name: conditioning
dtype: image
- name: samples
dtype: image
- name: reconstruction
dtype: image
- name: control
dtype: image
splits:
- name: train
num_bytes: 1061432873.487
num_examples: 1317
download_size: 1058860880
dataset_size: 1061432873.487
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "controlnet_fs_fetch"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mtc/multirc_sample_questions_merged | ---
dataset_info:
features:
- name: document
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 75446
num_examples: 222
download_size: 38183
dataset_size: 75446
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
littlebird13/mydatasets | ---
license: apache-2.0
---
|
Hikari0608/Eval | ---
dataset_info:
features:
- name: raw
dtype: image
splits:
- name: UIEB
num_bytes: 31221694.0
num_examples: 60
download_size: 31218498
dataset_size: 31221694.0
configs:
- config_name: default
data_files:
- split: UIEB
path: data/UIEB-*
---
# Dataset Card for "Eval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jondewoo/postapoc-cubism | ---
license: cc0-1.0
---
|
iamvastava/underwater_image_enhancement | ---
license: apache-2.0
---
|
Una713/description2Image | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 9546659.0
num_examples: 335
download_size: 5446356
dataset_size: 9546659.0
---
# Dataset Card for "description2Image"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Seanxh/twitter_dataset_1713069560 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 90553
num_examples: 200
download_size: 24808
dataset_size: 90553
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
james-burton/OrientalMuseum_min6-3Dwhite-mat | ---
dataset_info:
features:
- name: obj_num
dtype: string
- name: file
dtype: string
- name: image
dtype: image
- name: root
dtype: string
- name: description
dtype: string
- name: object_name
dtype: string
- name: other_name
dtype: string
- name: label
dtype:
class_label:
names:
'0': Animal Mummy
'1': Batik
'2': Buffalo Horn
'3': Chinese Red Rosewood
'4': Colour on Paper
'5': Flint/Chert
'6': Gouache on Paper
'7': Haematite/Red Ochre
'8': Human Bone
'9': Ink and Colour on Paper
'10': Ink and Colours on Silk
'11': Ink and Opaque Watercolour on Paper
'12': Ink on Paper
'13': Jade (Calcified)
'14': Japanese paper
'15': Microcline/Green Feldspar/Amazon-Stone
'16': Nile Mud
'17': Opaque Watercolour on Paper
'18': Opaque Watercolour or Gouache on Mica
'19': Pith
'20': Pith Paper
'21': Plant Product
'22': Resin/Plastic
'23': Rhinoceros Horn
'24': Smaragdite
'25': Steatite
'26': Steatite/Soap Stone
'27': Watercolour on Rice Paper
'28': acrylic
'29': agate
'30': alabaster
'31': aluminum
'32': amber
'33': amethyst
'34': antler
'35': artificial stone
'36': bamboo
'37': basalt
'38': bone
'39': bowenite
'40': boxwood
'41': brass
'42': brocade
'43': bronze
'44': burnt jade
'45': canvas
'46': cardboard
'47': cards
'48': carnelian
'49': cast iron
'50': celadon
'51': cellulose acetate
'52': ceramic
'53': chalcedony
'54': cherry
'55': clay
'56': coconut
'57': copper
'58': copper alloy
'59': coral
'60': cotton
'61': crystal
'62': diorite
'63': earthenware
'64': ebony
'65': emerald
'66': enamel
'67': faience
'68': flax
'69': flint
'70': gauze
'71': glass
'72': gold
'73': granite
'74': gray ware
'75': hardwood
'76': horn
'77': incense
'78': ink
'79': iron
'80': ivory
'81': jade
'82': jadeite
'83': jasper
'84': lacquer
'85': lapis lazuli
'86': lead
'87': lead alloy
'88': leather
'89': limestone
'90': linen
'91': malachite
'92': marble
'93': metal
'94': mineral
'95': mother of pearl
'96': nephrite
'97': nylon
'98': obsidian
'99': organic material
'100': paint
'101': palm fiber
'102': palm leaf
'103': paper
'104': papier mâché
'105': papyrus
'106': pewter
'107': photographic paper
'108': pine
'109': plant fiber
'110': plaster
'111': plastic
'112': plate
'113': polyester
'114': polystyrene
'115': porcelain
'116': pottery
'117': quartzite
'118': rattan
'119': realgar
'120': reed
'121': rice paper
'122': rock
'123': rush
'124': sandstone
'125': satin
'126': schist
'127': seashell
'128': serpentine
'129': shell
'130': silk
'131': silver
'132': slate
'133': soapstone
'134': softwood
'135': steel
'136': stone
'137': stoneware
'138': stucco
'139': sycamore
'140': synthetic fiber
'141': teak
'142': terracotta
'143': textiles
'144': tin
'145': tortoise shell
'146': travertine
'147': tremolite
'148': turquoise
'149': velvet
'150': wood
'151': wool
'152': wrought iron
'153': zinc alloy
- name: production.period
dtype: string
- name: production.place
dtype: string
splits:
- name: validation
num_bytes: 676051498.288
num_examples: 5416
- name: test
num_bytes: 654475118.192
num_examples: 5416
- name: train
num_bytes: 5401733376.12
num_examples: 115070
download_size: 6301924880
dataset_size: 6732259992.6
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: test
path: data/test-*
- split: train
path: data/train-*
---
|
ostapeno/sharegpt | ---
dataset_info:
features:
- name: dataset
dtype: string
- name: id
dtype: string
- name: messages
list:
- name: role
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 815707764
num_examples: 168864
download_size: 347091152
dataset_size: 815707764
---
# Dataset Card for "sharegpt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
paul-w-qs/handling_charges_v2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: JSON_LABEL
dtype: string
splits:
- name: train
num_bytes: 493878408.0
num_examples: 1000
download_size: 491831352
dataset_size: 493878408.0
---
# Dataset Card for "handling_charges_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o_gate_up_down | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o_gate_up_down
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o_gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o_gate_up_down)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o_gate_up_down\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T03:26:24.700404](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o_gate_up_down/blob/main/results_2023-10-28T03-26-24.700404.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3543414429530201,\n\
\ \"em_stderr\": 0.004898375582961981,\n \"f1\": 0.39350461409396,\n\
\ \"f1_stderr\": 0.004814089237052805,\n \"acc\": 0.4362902583993399,\n\
\ \"acc_stderr\": 0.01014740888692861\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.3543414429530201,\n \"em_stderr\": 0.004898375582961981,\n\
\ \"f1\": 0.39350461409396,\n \"f1_stderr\": 0.004814089237052805\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10462471569370735,\n \
\ \"acc_stderr\": 0.008430668082029278\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7679558011049724,\n \"acc_stderr\": 0.011864149691827943\n\
\ }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o_gate_up_down
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|arc:challenge|25_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T03_26_24.700404
path:
- '**/details_harness|drop|3_2023-10-28T03-26-24.700404.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T03-26-24.700404.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T03_26_24.700404
path:
- '**/details_harness|gsm8k|5_2023-10-28T03-26-24.700404.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T03-26-24.700404.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hellaswag|10_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-10-57.019261.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T05-10-57.019261.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T05-10-57.019261.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T03_26_24.700404
path:
- '**/details_harness|winogrande|5_2023-10-28T03-26-24.700404.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T03-26-24.700404.parquet'
- config_name: results
data_files:
- split: 2023_10_04T05_10_57.019261
path:
- results_2023-10-04T05-10-57.019261.parquet
- split: 2023_10_28T03_26_24.700404
path:
- results_2023-10-28T03-26-24.700404.parquet
- split: latest
path:
- results_2023-10-28T03-26-24.700404.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o_gate_up_down
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o_gate_up_down
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o_gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o_gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o_gate_up_down",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-28T03:26:24.700404](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE4_3.8w-r4-q_k_v_o_gate_up_down/blob/main/results_2023-10-28T03-26-24.700404.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" config and the "latest" split for each eval):
```python
{
"all": {
"em": 0.3543414429530201,
"em_stderr": 0.004898375582961981,
"f1": 0.39350461409396,
"f1_stderr": 0.004814089237052805,
"acc": 0.4362902583993399,
"acc_stderr": 0.01014740888692861
},
"harness|drop|3": {
"em": 0.3543414429530201,
"em_stderr": 0.004898375582961981,
"f1": 0.39350461409396,
"f1_stderr": 0.004814089237052805
},
"harness|gsm8k|5": {
"acc": 0.10462471569370735,
"acc_stderr": 0.008430668082029278
},
"harness|winogrande|5": {
"acc": 0.7679558011049724,
"acc_stderr": 0.011864149691827943
}
}
```
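As a small illustration (the numbers are copied verbatim from the JSON above, so the dict shape is the one the leaderboard emits), the per-task metrics can be treated as an ordinary Python dict, e.g. to pull out every task that reports an accuracy:

```python
# Per-task metrics copied from the "latest results" JSON above.
latest = {
    "harness|drop|3": {"em": 0.3543414429530201, "f1": 0.39350461409396},
    "harness|gsm8k|5": {"acc": 0.10462471569370735, "acc_stderr": 0.008430668082029278},
    "harness|winogrande|5": {"acc": 0.7679558011049724, "acc_stderr": 0.011864149691827943},
}

# Keep only tasks that report an accuracy (drop reports em/f1 instead).
accs = {task: metrics["acc"] for task, metrics in latest.items() if "acc" in metrics}
print(accs)
```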
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
timothyckl/tiny-ultrachat-uncensored | ---
dataset_info:
features:
- name: data
sequence: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 880539616
num_examples: 175245
download_size: 453661628
dataset_size: 880539616
---
# Dataset Card for "tiny-ultrachat-uncensored"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_SanjiWatsuki__Lelantos-7B | ---
pretty_name: Evaluation run of SanjiWatsuki/Lelantos-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SanjiWatsuki/Lelantos-7B](https://huggingface.co/SanjiWatsuki/Lelantos-7B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SanjiWatsuki__Lelantos-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-04T11:51:45.289850](https://huggingface.co/datasets/open-llm-leaderboard/details_SanjiWatsuki__Lelantos-7B/blob/main/results_2024-01-04T11-51-45.289850.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6459625636151246,\n\
\ \"acc_stderr\": 0.03219655588014755,\n \"acc_norm\": 0.6456793887410595,\n\
\ \"acc_norm_stderr\": 0.03286123598105045,\n \"mc1\": 0.4773561811505508,\n\
\ \"mc1_stderr\": 0.01748554225848965,\n \"mc2\": 0.6518191080402123,\n\
\ \"mc2_stderr\": 0.015251041012498917\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6621160409556314,\n \"acc_stderr\": 0.013822047922283512,\n\
\ \"acc_norm\": 0.6902730375426621,\n \"acc_norm_stderr\": 0.013512058415238363\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6910973909579765,\n\
\ \"acc_stderr\": 0.004610966122378294,\n \"acc_norm\": 0.8690499900418244,\n\
\ \"acc_norm_stderr\": 0.0033665623627415564\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n\
\ \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566018,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566018\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.02328766512726855,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.02328766512726855\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6435897435897436,\n \"acc_stderr\": 0.024283140529467305,\n\
\ \"acc_norm\": 0.6435897435897436,\n \"acc_norm_stderr\": 0.024283140529467305\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524565,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524565\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461763,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461763\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624734,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624734\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577615,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577615\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40670391061452515,\n\
\ \"acc_stderr\": 0.016428811915898865,\n \"acc_norm\": 0.40670391061452515,\n\
\ \"acc_norm_stderr\": 0.016428811915898865\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.025403832978179615,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.025403832978179615\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035457,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035457\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n\
\ \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n\
\ \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170598,\n\
\ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170598\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n\
\ \"acc_stderr\": 0.027962677604768917,\n \"acc_norm\": 0.8059701492537313,\n\
\ \"acc_norm_stderr\": 0.027962677604768917\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4773561811505508,\n\
\ \"mc1_stderr\": 0.01748554225848965,\n \"mc2\": 0.6518191080402123,\n\
\ \"mc2_stderr\": 0.015251041012498917\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8066298342541437,\n \"acc_stderr\": 0.011099796645920533\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7081122062168309,\n \
\ \"acc_stderr\": 0.012522795894420867\n }\n}\n```"
repo_url: https://huggingface.co/SanjiWatsuki/Lelantos-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|arc:challenge|25_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|gsm8k|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hellaswag|10_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T11-51-45.289850.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T11-51-45.289850.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- '**/details_harness|winogrande|5_2024-01-04T11-51-45.289850.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-04T11-51-45.289850.parquet'
- config_name: results
data_files:
- split: 2024_01_04T11_51_45.289850
path:
- results_2024-01-04T11-51-45.289850.parquet
- split: latest
path:
- results_2024-01-04T11-51-45.289850.parquet
---
# Dataset Card for Evaluation run of SanjiWatsuki/Lelantos-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SanjiWatsuki/Lelantos-7B](https://huggingface.co/SanjiWatsuki/Lelantos-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SanjiWatsuki__Lelantos-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-04T11:51:45.289850](https://huggingface.co/datasets/open-llm-leaderboard/details_SanjiWatsuki__Lelantos-7B/blob/main/results_2024-01-04T11-51-45.289850.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6459625636151246,
"acc_stderr": 0.03219655588014755,
"acc_norm": 0.6456793887410595,
"acc_norm_stderr": 0.03286123598105045,
"mc1": 0.4773561811505508,
"mc1_stderr": 0.01748554225848965,
"mc2": 0.6518191080402123,
"mc2_stderr": 0.015251041012498917
},
"harness|arc:challenge|25": {
"acc": 0.6621160409556314,
"acc_stderr": 0.013822047922283512,
"acc_norm": 0.6902730375426621,
"acc_norm_stderr": 0.013512058415238363
},
"harness|hellaswag|10": {
"acc": 0.6910973909579765,
"acc_stderr": 0.004610966122378294,
"acc_norm": 0.8690499900418244,
"acc_norm_stderr": 0.0033665623627415564
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566018,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566018
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726855,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726855
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479049,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479049
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6435897435897436,
"acc_stderr": 0.024283140529467305,
"acc_norm": 0.6435897435897436,
"acc_norm_stderr": 0.024283140529467305
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524565,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524565
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461763,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461763
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624734,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624734
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577615,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577615
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40670391061452515,
"acc_stderr": 0.016428811915898865,
"acc_norm": 0.40670391061452515,
"acc_norm_stderr": 0.016428811915898865
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02526169121972948,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02526169121972948
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179615,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179615
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035457,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365549,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365549
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170598,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170598
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.018824219512706207,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.018824219512706207
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768917,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768917
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4773561811505508,
"mc1_stderr": 0.01748554225848965,
"mc2": 0.6518191080402123,
"mc2_stderr": 0.015251041012498917
},
"harness|winogrande|5": {
"acc": 0.8066298342541437,
"acc_stderr": 0.011099796645920533
},
"harness|gsm8k|5": {
"acc": 0.7081122062168309,
"acc_stderr": 0.012522795894420867
}
}
```
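As an illustrative sketch (not part of the official evaluation pipeline), the per-task accuracies in a results dict of this shape can be macro-averaged locally; only a few tasks from the JSON above are shown here for brevity:

```python
# Macro-average the MMLU ("hendrycksTest") per-task accuracies from a
# results dict shaped like the JSON above (truncated to three tasks).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.32},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5925925925925926},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7171052631578947},
}

# Keep only the MMLU subtasks and pull out their accuracies.
mmlu_accs = {
    name: metrics["acc"]
    for name, metrics in results.items()
    if name.startswith("harness|hendrycksTest-")
}

# Unweighted (macro) average over the selected tasks.
mmlu_macro_avg = sum(mmlu_accs.values()) / len(mmlu_accs)
print(f"MMLU macro-average over {len(mmlu_accs)} tasks: {mmlu_macro_avg:.4f}")
```

The leaderboard computes its aggregate from all 57 MMLU subtasks; the snippet above only demonstrates the shape of the computation on a sample.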
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
lawful-good-project/ipc_decisions_4k_1024 | ---
license: gpl-3.0
task_categories:
- text-generation
language:
- ru
tags:
- legal
---
A dataset of court decisions of the Intellectual Property Court of the Russian Federation, with rows of up to 1024 characters and instruction-style formatting for instruction fine-tuning. |
gguichard/wsd_myriade_synth_data_gpt4turbo_val_sansdef | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: wn_sens
sequence: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 466852
num_examples: 676
download_size: 111646
dataset_size: 466852
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "wsd_myriade_synth_data_gpt4turbo_val_sansdef"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1712937265 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 17659
num_examples: 42
download_size: 12256
dataset_size: 17659
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1712937265"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sachith-surge/LaMini-instruction-only-SequenceMatcher-Levenstein | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: source
dtype: string
- name: similarity_sequencematcher
dtype: float64
- name: most_similar_example_sequencematcher
dtype: string
- name: similarity_edit
dtype: int64
splits:
- name: train
num_bytes: 79185.40272028386
num_examples: 186
download_size: 56949
dataset_size: 79185.40272028386
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "LaMini-instruction-only-SequenceMatcher-Levenstein"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nafiturgut/inpaint-finetune | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1608841.0
num_examples: 9
download_size: 1610882
dataset_size: 1608841.0
---
# Dataset Card for "inpaint-finetune"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/random25eof_find_passage_train10000_eval1000_rare | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 2174452
num_examples: 21000
- name: validation
num_bytes: 118222
num_examples: 1000
download_size: 790893
dataset_size: 2292674
---
# Dataset Card for "random25eof_find_passage_train10000_eval1000_rare"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
myradeng/diffusion_db_5k_train_v2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: prompt
dtype: string
- name: seed
dtype: uint32
- name: step
dtype: uint16
- name: cfg
dtype: float32
- name: sampler
dtype: string
- name: width
dtype: uint16
- name: height
dtype: uint16
- name: user_name
dtype: string
- name: timestamp
dtype: timestamp[us, tz=UTC]
- name: image_nsfw
dtype: float32
- name: prompt_nsfw
dtype: float32
splits:
- name: train
num_bytes: 1817027835.2
num_examples: 4000
download_size: 1800963916
dataset_size: 1817027835.2
---
# Dataset Card for "diffusion_db_5k_train_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AndyLiu0104/Soldering-Data-Tiny | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 2211549.5
num_examples: 1412
download_size: 1354207
dataset_size: 2211549.5
---
# Dataset Card for "Soldering-Data-Tiny"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
garutyunov/litbank-entities | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence: string
- name: ner_spans
sequence:
sequence: int32
splits:
- name: train
num_bytes: 6575841
num_examples: 100
download_size: 2042117
dataset_size: 6575841
---
# Dataset Card for "litbank-entities"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AppleHarem/charles_ausburne_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of charles_ausburne (Azur Lane)
This is the dataset of charles_ausburne (Azur Lane), containing 12 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
This is a WebUI contains crawlers and other thing: ([LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI))
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 12 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 30 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 31 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 12 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 12 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 12 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 30 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 30 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 12 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 31 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 31 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
icantiemyshoe/cve-to-metasploit-module | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: source
dtype: string
- name: cve
dtype: string
- name: script_type
dtype: string
# splits:
# - name: train
# num_bytes: 290000000
# num_examples: 4278
# download_size: 290000000
# dataset_size: 290000000
license: bsd-2-clause
language:
- en
size_categories:
- 1K<n<10K
---
# CVE To Metasploit Module Prompt
This dataset is a submodule of the overall project to create an LLM that can look at newly published CVE write-ups and create Metasploit modules. The main repo for the project can be found [here](https://github.com/roostercoopllc/metAIsploit-assistant).
## Usage
*TO-DO*
## References
*TO-DO*
|
Akram-AI/flipkarts | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Baby Care
'1': Beauty and Personal Care
'2': Computers
'3': Home Decor & Festive Needs
'4': Home Furnishing
'5': Kitchen & Dining
'6': Watches
splits:
- name: train
num_bytes: 313641841.13714284
num_examples: 892
- name: test
num_bytes: 52251354.86285714
num_examples: 158
download_size: 361310413
dataset_size: 365893196.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
LRGB/peptides-structural | ---
task_categories:
- graph-ml
size_categories:
- 1M<n<10M
tags:
- lrgb
license: cc-by-nc-4.0
---
# `peptides-structural`
### Dataset Summary
| Dataset | Domain | Task | Node Feat. (dim) | Edge Feat. (dim) | Perf. Metric |
|---|---|---|---|---|---|
| Peptides-struct | Chemistry | Graph Regression | Atom Encoder (9) | Bond Encoder (3) | MAE |
| Dataset | # Graphs | # Nodes | μ Nodes | μ Deg. | # Edges | μ Edges | μ Short. Path | μ Diameter |
|---|---:|---:|---:|:---:|---:|---:|---:|---:|
| Peptides-struct | 15,535 | 2,344,859 | 150.94 | 2.04 | 4,773,974 | 307.30 | 20.89±9.79 | 56.99±28.72 |
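Peptides-struct is evaluated with mean absolute error (MAE) over the regressed structural targets. As a quick reference, a minimal sketch of the metric (the function name and toy values are illustrative, not taken from LRGB):

```python
def mean_absolute_error(y_true, y_pred):
    """Unweighted MAE across all targets, the Peptides-struct performance metric."""
    assert len(y_true) == len(y_pred) and len(y_true) > 0
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

# Toy example: per-graph regression targets vs. predictions.
print(mean_absolute_error([1.0, 2.0, 4.0], [1.5, 2.0, 3.0]))  # 0.5
```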
## Additional Information
### Dataset Curators
* Vijay Prakash Dwivedi ([vijaydwivedi75](https://github.com/vijaydwivedi75))
### Citation Information
```
@article{dwivedi2022LRGB,
title={Long Range Graph Benchmark},
author={Dwivedi, Vijay Prakash and Rampášek, Ladislav and Galkin, Mikhail and Parviz, Ali and Wolf, Guy and Luu, Anh Tuan and Beaini, Dominique},
journal={arXiv:2206.08164},
year={2022}
}
``` |
eswardivi/indiccorp_telugu_line | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 15889353299
num_examples: 42916522
download_size: 6566759636
dataset_size: 15889353299
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
search_qa | ---
annotations_creators:
- found
language:
- en
language_creators:
- found
license:
- unknown
multilinguality:
- monolingual
pretty_name: SearchQA
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- extractive-qa
paperswithcode_id: searchqa
dataset_info:
- config_name: raw_jeopardy
features:
- name: category
dtype: string
- name: air_date
dtype: string
- name: question
dtype: string
- name: value
dtype: string
- name: answer
dtype: string
- name: round
dtype: string
- name: show_number
dtype: int32
- name: search_results
sequence:
- name: urls
dtype: string
- name: snippets
dtype: string
- name: titles
dtype: string
- name: related_links
dtype: string
splits:
- name: train
num_bytes: 7770972348
num_examples: 216757
download_size: 3314386157
dataset_size: 7770972348
- config_name: train_test_val
features:
- name: category
dtype: string
- name: air_date
dtype: string
- name: question
dtype: string
- name: value
dtype: string
- name: answer
dtype: string
- name: round
dtype: string
- name: show_number
dtype: int32
- name: search_results
sequence:
- name: urls
dtype: string
- name: snippets
dtype: string
- name: titles
dtype: string
- name: related_links
dtype: string
splits:
- name: train
num_bytes: 5303005740
num_examples: 151295
- name: test
num_bytes: 1466749978
num_examples: 43228
- name: validation
num_bytes: 740962715
num_examples: 21613
download_size: 3148550732
dataset_size: 7510718433
---
# Dataset Card for "search_qa"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** https://github.com/nyu-dl/dl4ir-searchQA
- **Paper:** [SearchQA: A New Q&A Dataset Augmented with Context from a Search Engine](https://arxiv.org/abs/1704.05179)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 6.46 GB
- **Size of the generated dataset:** 15.28 GB
- **Total amount of disk used:** 21.74 GB
### Dataset Summary
We publicly release a new large-scale dataset, called SearchQA, for machine comprehension, or question-answering. Unlike recently released datasets, such as DeepMind
CNN/DailyMail and SQuAD, the proposed SearchQA was constructed to reflect a full pipeline of general question-answering. That is, we start not from an existing article
and generate a question-answer pair, but start from an existing question-answer pair, crawled from J! Archive, and augment it with text snippets retrieved by Google.
Following this approach, we built SearchQA, which consists of more than 140k question-answer pairs with each pair having 49.6 snippets on average. Each question-answer-context
tuple of the SearchQA comes with additional meta-data such as the snippet's URL, which we believe will be valuable resources for future research. We conduct human evaluation
as well as test two baseline methods, one simple word selection and the other deep learning based, on the SearchQA. We show that there is a meaningful gap between the human
and machine performances. This suggests that the proposed dataset could well serve as a benchmark for question-answering.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### raw_jeopardy
- **Size of downloaded dataset files:** 3.31 GB
- **Size of the generated dataset:** 7.77 GB
- **Total amount of disk used:** 11.09 GB
An example of 'train' looks as follows.
```
```
#### train_test_val
- **Size of downloaded dataset files:** 3.15 GB
- **Size of the generated dataset:** 7.51 GB
- **Total amount of disk used:** 10.66 GB
An example of 'validation' looks as follows.
```
```
### Data Fields
The data fields are the same among all splits.
#### raw_jeopardy
- `category`: a `string` feature.
- `air_date`: a `string` feature.
- `question`: a `string` feature.
- `value`: a `string` feature.
- `answer`: a `string` feature.
- `round`: a `string` feature.
- `show_number`: a `int32` feature.
- `search_results`: a dictionary feature containing:
- `urls`: a `string` feature.
- `snippets`: a `string` feature.
- `titles`: a `string` feature.
- `related_links`: a `string` feature.
#### train_test_val
- `category`: a `string` feature.
- `air_date`: a `string` feature.
- `question`: a `string` feature.
- `value`: a `string` feature.
- `answer`: a `string` feature.
- `round`: a `string` feature.
- `show_number`: a `int32` feature.
- `search_results`: a dictionary feature containing:
- `urls`: a `string` feature.
- `snippets`: a `string` feature.
- `titles`: a `string` feature.
- `related_links`: a `string` feature.
### Data Splits
#### raw_jeopardy
| |train |
|------------|-----:|
|raw_jeopardy|216757|
#### train_test_val
| |train |validation|test |
|--------------|-----:|---------:|----:|
|train_test_val|151295| 21613|43228|
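Each example stores `search_results` as parallel sequences (`urls`, `snippets`, `titles`, `related_links`). A hedged sketch of zipping them back into per-snippet records — the example record below is made up to match the documented schema, not taken from the dataset:

```python
def pair_search_results(search_results):
    """Zip the parallel url/snippet/title/related_links sequences into one dict per result."""
    keys = ("urls", "snippets", "titles", "related_links")
    return [dict(zip(keys, values))
            for values in zip(*(search_results[k] for k in keys))]

# Toy record shaped like the documented `search_results` feature.
example = {
    "urls": ["http://example.com/a", "http://example.com/b"],
    "snippets": ["snippet a", "snippet b"],
    "titles": ["title a", "title b"],
    "related_links": ["", ""],
}
for result in pair_search_results(example):
    print(result["titles"], "->", result["urls"])
```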
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@article{DBLP:journals/corr/DunnSHGCC17,
author = {Matthew Dunn and
Levent Sagun and
Mike Higgins and
V. Ugur G{\"{u}}ney and
Volkan Cirik and
Kyunghyun Cho},
title = {SearchQA: {A} New Q{\&}A Dataset Augmented with Context from a
Search Engine},
journal = {CoRR},
volume = {abs/1704.05179},
year = {2017},
url = {http://arxiv.org/abs/1704.05179},
archivePrefix = {arXiv},
eprint = {1704.05179},
timestamp = {Mon, 13 Aug 2018 16:47:09 +0200},
biburl = {https://dblp.org/rec/journals/corr/DunnSHGCC17.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
### Contributions
Thanks to [@lewtun](https://github.com/lewtun), [@mariamabarham](https://github.com/mariamabarham), [@lhoestq](https://github.com/lhoestq), [@thomwolf](https://github.com/thomwolf) for adding this dataset. |
RandomThinker42/FoundersArchives | ---
license: cc-by-nc-4.0
dataset_info:
features:
- name: title
dtype: string
- name: permalink
dtype: string
- name: project
dtype: string
- name: authors
dtype: string
- name: recipients
dtype: string
- name: date-from
dtype: timestamp[s]
- name: date-to
dtype: timestamp[s]
- name: content
dtype: string
splits:
- name: train
num_bytes: 452725661
num_examples: 181586
download_size: 241059447
dataset_size: 452725661
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
This dataset is a simple collection of 182,079 documents found in the [Founders Online metadata](https://founders.archives.gov/Metadata/founders-online-metadata.json)
with the matching content from the API.
This was just a quick weekend project idea and so I haven't spent much time on it. It should not be used without cleanup.
I have no affiliation with [NHPRC](https://www.archives.gov/nhprc) or the [University of Virginia Press](https://www.upress.virginia.edu),
I merely collected the data for my own personal interest.
## Quick notes
I had never made my own HF dataset before, so it took a few attempts to upload, and I am quite sure I did it wrong.
I did attempt an initial fine-tuning run using it as-is anyway, which took a few weeks, but it wasn't worth it and I lack the time to do it right.
The intent was for it to be cleaned up and structured for training or fine-tuning a model, but I have not had the time.
## Problem Documents
The data currently includes some sort of composite entries that need to be removed,
such as duplicative monthly journal entries, where there exists a document for each day already.
These composite documents exist in the original API, so I had no way to filter them in this initial pass.
Some digging has me concerned that some of those composite entries may also include copyrighted notes not available under the CC license,
so I would be wary of training before they are removed anyway.
There are also many documents that are likely not suitable as-is for training,
as they include shorthand and incomplete notes from journals; it might be saner to remove them.
Most documents also have very inconsistent spacing, indentation, and line breaks that likely should be cleaned up depending on the use case.
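As a starting point for the cleanup mentioned above, here is a hedged sketch of collapsing the inconsistent spacing and line breaks. This is my guess at a reasonable normalization, not something shipped with the dataset:

```python
import re

def normalize_whitespace(text):
    """Collapse runs of spaces/tabs and excess blank lines in a document's content."""
    text = re.sub(r"[ \t]+", " ", text)    # runs of spaces/tabs -> single space
    text = re.sub(r" ?\n ?", "\n", text)   # strip spaces hugging line breaks
    text = re.sub(r"\n{3,}", "\n\n", text) # cap consecutive blank lines at one
    return text.strip()

print(normalize_whitespace("Dear  Sir,\n\n\n\n   I have   received\t your letter.\n"))
```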
## Missing Documents
This is missing at least 2548 documents from the John Jay collection,
because the API does not yet have those documents.
An additional 281 documents from other collections were automatically excluded because the API returned no content.
# From the [website](https://www.archives.gov/open/nhprc/dataset-founders-online):
### What is the data?
This dataset provides information about the more than 180,000 documents published as part of [Founders Online](https://founders.archives.gov).
Founders Online includes the correspondence and writings of John Adams, Benjamin Franklin, Alexander Hamilton, Thomas Jefferson, James Madison and George Washington.
### Is this dataset in the public domain?
As the work of the [University of Virginia Press](https://www.upress.virginia.edu), this data is released for non-commercial use with attribution.
|
NexaAI/Cosmetics | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 4009237.0
num_examples: 1
download_size: 4010161
dataset_size: 4009237.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Cosmetics"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
j-chim/pii-pile-chunk3-0-50000-tagged | ---
dataset_info:
features:
- name: texts
sequence: string
- name: meta
struct:
- name: pile_set_name
dtype: string
- name: scores
sequence: float64
- name: avg_score
dtype: float64
- name: num_sents
dtype: int64
- name: tagged_pii_results
list:
- name: analysis_explanation
dtype: 'null'
- name: end
dtype: int64
- name: entity_type
dtype: string
- name: recognition_metadata
struct:
- name: recognizer_identifier
dtype: string
- name: recognizer_name
dtype: string
- name: score
dtype: float64
- name: start
dtype: int64
splits:
- name: train
num_bytes: 505187810
num_examples: 50000
download_size: 192707833
dataset_size: 505187810
---
# Dataset Card for "pii-pile-chunk3-0-50000-tagged"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joseluhf11/oct-fovea-detection_v4_truth | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: objects
struct:
- name: bbox
sequence:
sequence: int64
- name: categories
sequence: string
- name: pred_angle
dtype: float64
- name: tag
dtype: string
- name: truth_angle
dtype: float64
splits:
- name: train
num_bytes: 76966356.0
num_examples: 95
download_size: 76970573
dataset_size: 76966356.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
InceptiveDev/new_objective | ---
license: mit
---
|
Terkawy/Networking-Basics-Lecture1 | ---
task_categories:
- table-question-answering
language:
- ar
tags:
- art
pretty_name: Terkawy
size_categories:
- 1K<n<10K
---
This dataset is for educational purposes, and I do not recommend that anyone try it, because your computer will burn out after that. |
open-llm-leaderboard/details_lmsys__longchat-7b-v1.5-32k | ---
pretty_name: Evaluation run of lmsys/longchat-7b-v1.5-32k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lmsys/longchat-7b-v1.5-32k](https://huggingface.co/lmsys/longchat-7b-v1.5-32k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lmsys__longchat-7b-v1.5-32k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-16T16:20:33.188247](https://huggingface.co/datasets/open-llm-leaderboard/details_lmsys__longchat-7b-v1.5-32k/blob/main/results_2023-10-16T16-20-33.188247.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.08252936241610738,\n\
\ \"em_stderr\": 0.0028179934761829416,\n \"f1\": 0.1372829278523486,\n\
\ \"f1_stderr\": 0.0030245592633561815,\n \"acc\": 0.3672124310289838,\n\
\ \"acc_stderr\": 0.009455449816488642\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.08252936241610738,\n \"em_stderr\": 0.0028179934761829416,\n\
\ \"f1\": 0.1372829278523486,\n \"f1_stderr\": 0.0030245592633561815\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.047763457164518575,\n \
\ \"acc_stderr\": 0.005874387536229305\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6866614048934491,\n \"acc_stderr\": 0.01303651209674798\n\
\ }\n}\n```"
repo_url: https://huggingface.co/lmsys/longchat-7b-v1.5-32k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_16T16_20_33.188247
path:
- '**/details_harness|drop|3_2023-10-16T16-20-33.188247.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-16T16-20-33.188247.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_16T16_20_33.188247
path:
- '**/details_harness|gsm8k|5_2023-10-16T16-20-33.188247.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-16T16-20-33.188247.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_16T16_20_33.188247
path:
- '**/details_harness|winogrande|5_2023-10-16T16-20-33.188247.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-16T16-20-33.188247.parquet'
- config_name: results
data_files:
- split: 2023_10_16T16_20_33.188247
path:
- results_2023-10-16T16-20-33.188247.parquet
- split: latest
path:
- results_2023-10-16T16-20-33.188247.parquet
---
# Dataset Card for Evaluation run of lmsys/longchat-7b-v1.5-32k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lmsys/longchat-7b-v1.5-32k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lmsys/longchat-7b-v1.5-32k](https://huggingface.co/lmsys/longchat-7b-v1.5-32k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lmsys__longchat-7b-v1.5-32k",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-16T16:20:33.188247](https://huggingface.co/datasets/open-llm-leaderboard/details_lmsys__longchat-7b-v1.5-32k/blob/main/results_2023-10-16T16-20-33.188247.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.08252936241610738,
"em_stderr": 0.0028179934761829416,
"f1": 0.1372829278523486,
"f1_stderr": 0.0030245592633561815,
"acc": 0.3672124310289838,
"acc_stderr": 0.009455449816488642
},
"harness|drop|3": {
"em": 0.08252936241610738,
"em_stderr": 0.0028179934761829416,
"f1": 0.1372829278523486,
"f1_stderr": 0.0030245592633561815
},
"harness|gsm8k|5": {
"acc": 0.047763457164518575,
"acc_stderr": 0.005874387536229305
},
"harness|winogrande|5": {
"acc": 0.6866614048934491,
"acc_stderr": 0.01303651209674798
}
}
```
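The top-level `"all"` block appears to be a simple unweighted aggregate of the per-task metrics. As a quick sanity check (this is my own arithmetic, not part of the evaluation harness):

```python
# Reported per-task accuracies from the results JSON above.
gsm8k_acc = 0.047763457164518575
winogrande_acc = 0.6866614048934491

# drop reports em/f1 rather than acc, so the "all" accuracy
# averages the two accuracy-reporting tasks.
all_acc = (gsm8k_acc + winogrande_acc) / 2
print(all_acc)  # ~0.3672, matching the reported "all" accuracy
```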
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Kevin-M-Smith/flint_images | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': clutter
'1': email
'2': email-squished
'3': handwritten-document
'4': spreadsheet
'5': typeset-document
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 178391248.0
num_examples: 4965
- name: test
num_bytes: 42819947.0
num_examples: 1242
download_size: 221040943
dataset_size: 221211195.0
---
# Dataset Card for "flint_images"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_invalid-coder__Sakura-SOLAR-Instruct-CarbonVillain-en-10.7B-v2-slerp | ---
pretty_name: Evaluation run of invalid-coder/Sakura-SOLAR-Instruct-CarbonVillain-en-10.7B-v2-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [invalid-coder/Sakura-SOLAR-Instruct-CarbonVillain-en-10.7B-v2-slerp](https://huggingface.co/invalid-coder/Sakura-SOLAR-Instruct-CarbonVillain-en-10.7B-v2-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_invalid-coder__Sakura-SOLAR-Instruct-CarbonVillain-en-10.7B-v2-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-31T12:52:31.599271](https://huggingface.co/datasets/open-llm-leaderboard/details_invalid-coder__Sakura-SOLAR-Instruct-CarbonVillain-en-10.7B-v2-slerp/blob/main/results_2024-03-31T12-52-31.599271.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6668729078581164,\n\
\ \"acc_stderr\": 0.03160537007752537,\n \"acc_norm\": 0.6676492166554839,\n\
\ \"acc_norm_stderr\": 0.032248990418071485,\n \"mc1\": 0.5703794369645043,\n\
\ \"mc1_stderr\": 0.017329234580409095,\n \"mc2\": 0.7193856673502277,\n\
\ \"mc2_stderr\": 0.015001288229149994\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6843003412969283,\n \"acc_stderr\": 0.013582571095815291,\n\
\ \"acc_norm\": 0.712457337883959,\n \"acc_norm_stderr\": 0.013226719056266125\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7125074686317466,\n\
\ \"acc_stderr\": 0.00451668195387909,\n \"acc_norm\": 0.8841864170483967,\n\
\ \"acc_norm_stderr\": 0.0031934725302821716\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.0498887651569859,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03523807393012047,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03523807393012047\n \
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.625531914893617,\n \"acc_stderr\": 0.03163910665367291,\n\
\ \"acc_norm\": 0.625531914893617,\n \"acc_norm_stderr\": 0.03163910665367291\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.039966295748767186,\n\
\ \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.039966295748767186\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4973544973544973,\n \"acc_stderr\": 0.02575094967813039,\n \"\
acc_norm\": 0.4973544973544973,\n \"acc_norm_stderr\": 0.02575094967813039\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8129032258064516,\n\
\ \"acc_stderr\": 0.022185710092252252,\n \"acc_norm\": 0.8129032258064516,\n\
\ \"acc_norm_stderr\": 0.022185710092252252\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7184873949579832,\n \"acc_stderr\": 0.02921354941437217,\n \
\ \"acc_norm\": 0.7184873949579832,\n \"acc_norm_stderr\": 0.02921354941437217\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5833333333333334,\n \"acc_stderr\": 0.033622774366080424,\n \"\
acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.033622774366080424\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \
\ \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.0230866350868414,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.0230866350868414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8045977011494253,\n\
\ \"acc_stderr\": 0.014179171373424383,\n \"acc_norm\": 0.8045977011494253,\n\
\ \"acc_norm_stderr\": 0.014179171373424383\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7601156069364162,\n \"acc_stderr\": 0.022989592543123563,\n\
\ \"acc_norm\": 0.7601156069364162,\n \"acc_norm_stderr\": 0.022989592543123563\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39217877094972065,\n\
\ \"acc_stderr\": 0.016329061073207446,\n \"acc_norm\": 0.39217877094972065,\n\
\ \"acc_norm_stderr\": 0.016329061073207446\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7331189710610932,\n\
\ \"acc_stderr\": 0.025122637608816643,\n \"acc_norm\": 0.7331189710610932,\n\
\ \"acc_norm_stderr\": 0.025122637608816643\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0227797190887334,\n\
\ \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0227797190887334\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4954367666232073,\n\
\ \"acc_stderr\": 0.012769704263117522,\n \"acc_norm\": 0.4954367666232073,\n\
\ \"acc_norm_stderr\": 0.012769704263117522\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.026679252270103128,\n\
\ \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.026679252270103128\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069446,\n \
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069446\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5703794369645043,\n\
\ \"mc1_stderr\": 0.017329234580409095,\n \"mc2\": 0.7193856673502277,\n\
\ \"mc2_stderr\": 0.015001288229149994\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8342541436464088,\n \"acc_stderr\": 0.010450899545370632\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6535253980288097,\n \
\ \"acc_stderr\": 0.013107179054313398\n }\n}\n```"
repo_url: https://huggingface.co/invalid-coder/Sakura-SOLAR-Instruct-CarbonVillain-en-10.7B-v2-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|arc:challenge|25_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|gsm8k|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hellaswag|10_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T12-52-31.599271.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-31T12-52-31.599271.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- '**/details_harness|winogrande|5_2024-03-31T12-52-31.599271.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-31T12-52-31.599271.parquet'
- config_name: results
data_files:
- split: 2024_03_31T12_52_31.599271
path:
- results_2024-03-31T12-52-31.599271.parquet
- split: latest
path:
- results_2024-03-31T12-52-31.599271.parquet
---
# Dataset Card for Evaluation run of invalid-coder/Sakura-SOLAR-Instruct-CarbonVillain-en-10.7B-v2-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [invalid-coder/Sakura-SOLAR-Instruct-CarbonVillain-en-10.7B-v2-slerp](https://huggingface.co/invalid-coder/Sakura-SOLAR-Instruct-CarbonVillain-en-10.7B-v2-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_invalid-coder__Sakura-SOLAR-Instruct-CarbonVillain-en-10.7B-v2-slerp",
"harness_winogrande_5",
split="latest")
```
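Each run appears as a split named with the run timestamp, alongside a `latest` alias. Because the timestamp format (`YYYY_MM_DDTHH_MM_SS...`) sorts lexicographically in chronological order, the newest run can also be picked without parsing dates — a small sketch, using split names taken from this card:

```python
# Pick the most recent timestamped split; "latest" is an alias for it.
splits = ["2024_03_31T12_52_31.599271", "latest"]
runs = sorted(s for s in splits if s != "latest")  # lexicographic == chronological
print(runs[-1])  # → 2024_03_31T12_52_31.599271
```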
## Latest results
These are the [latest results from run 2024-03-31T12:52:31.599271](https://huggingface.co/datasets/open-llm-leaderboard/details_invalid-coder__Sakura-SOLAR-Instruct-CarbonVillain-en-10.7B-v2-slerp/blob/main/results_2024-03-31T12-52-31.599271.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6668729078581164,
"acc_stderr": 0.03160537007752537,
"acc_norm": 0.6676492166554839,
"acc_norm_stderr": 0.032248990418071485,
"mc1": 0.5703794369645043,
"mc1_stderr": 0.017329234580409095,
"mc2": 0.7193856673502277,
"mc2_stderr": 0.015001288229149994
},
"harness|arc:challenge|25": {
"acc": 0.6843003412969283,
"acc_stderr": 0.013582571095815291,
"acc_norm": 0.712457337883959,
"acc_norm_stderr": 0.013226719056266125
},
"harness|hellaswag|10": {
"acc": 0.7125074686317466,
"acc_stderr": 0.00451668195387909,
"acc_norm": 0.8841864170483967,
"acc_norm_stderr": 0.0031934725302821716
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.44,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.44,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.75,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.625531914893617,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.625531914893617,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6413793103448275,
"acc_stderr": 0.039966295748767186,
"acc_norm": 0.6413793103448275,
"acc_norm_stderr": 0.039966295748767186
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4973544973544973,
"acc_stderr": 0.02575094967813039,
"acc_norm": 0.4973544973544973,
"acc_norm_stderr": 0.02575094967813039
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8129032258064516,
"acc_stderr": 0.022185710092252252,
"acc_norm": 0.8129032258064516,
"acc_norm_stderr": 0.022185710092252252
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7184873949579832,
"acc_stderr": 0.02921354941437217,
"acc_norm": 0.7184873949579832,
"acc_norm_stderr": 0.02921354941437217
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.033622774366080424,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.033622774366080424
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.023363878096632446,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.023363878096632446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.0230866350868414,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.0230866350868414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8045977011494253,
"acc_stderr": 0.014179171373424383,
"acc_norm": 0.8045977011494253,
"acc_norm_stderr": 0.014179171373424383
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7601156069364162,
"acc_stderr": 0.022989592543123563,
"acc_norm": 0.7601156069364162,
"acc_norm_stderr": 0.022989592543123563
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39217877094972065,
"acc_stderr": 0.016329061073207446,
"acc_norm": 0.39217877094972065,
"acc_norm_stderr": 0.016329061073207446
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7331189710610932,
"acc_stderr": 0.025122637608816643,
"acc_norm": 0.7331189710610932,
"acc_norm_stderr": 0.025122637608816643
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0227797190887334,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0227797190887334
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4954367666232073,
"acc_stderr": 0.012769704263117522,
"acc_norm": 0.4954367666232073,
"acc_norm_stderr": 0.012769704263117522
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.026679252270103128,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.026679252270103128
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069446,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069446
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5703794369645043,
"mc1_stderr": 0.017329234580409095,
"mc2": 0.7193856673502277,
"mc2_stderr": 0.015001288229149994
},
"harness|winogrande|5": {
"acc": 0.8342541436464088,
"acc_stderr": 0.010450899545370632
},
"harness|gsm8k|5": {
"acc": 0.6535253980288097,
"acc_stderr": 0.013107179054313398
}
}
```
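The per-task entries above share the same metric keys (`acc`, `acc_norm`, and their standard errors). As a rough illustration of how an aggregate could be recomputed from this JSON, here is a small, hypothetical helper — note this is not the official leaderboard aggregation (which, for example, averages the 57 MMLU subtasks as one score), and the `example` dict only copies a few entries from the results above:

```python
# Hypothetical helper: average one metric across the per-task entries,
# skipping the precomputed "all" summary. Not the official aggregation.
def average_metric(results: dict, metric: str = "acc") -> float:
    vals = [v[metric] for name, v in results.items()
            if name != "all" and metric in v]
    return sum(vals) / len(vals)

# A few entries copied from the results JSON above.
example = {
    "all": {"acc": 0.6668729078581164},
    "harness|arc:challenge|25": {"acc": 0.6843003412969283, "acc_norm": 0.712457337883959},
    "harness|hellaswag|10": {"acc": 0.7125074686317466, "acc_norm": 0.8841864170483967},
    "harness|winogrande|5": {"acc": 0.8342541436464088},
}
print(round(average_metric(example, "acc"), 4))  # → 0.7437
```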
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ChavyvAkvar/chai-reward-kto-trainer-v1 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
- name: label
dtype: bool
splits:
- name: train
num_bytes: 65473456
num_examples: 33841
download_size: 37024405
dataset_size: 65473456
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
internlm/Agent-FLAN | ---
configs:
- config_name: default
data_files:
- split: agent_instruct_react
path: data/agent_instruct_react.jsonl
- split: agent_instruct_tflan
path: data/agent_instruct_tflan.jsonl
- split: toolbench_instruct_j1s1_3k
path: data/toolbench_instruct_j1s1_3k.jsonl
- split: toolbench_negative
path: data/toolbench_negative.jsonl
- split: toolbench_react_10p
path: data/toolbench_react_10p.jsonl
- split: toolbench_tflan_60p_r10r5u7
path: data/toolbench_tflan_60p_r10r5u7.jsonl
- split: toolbench_tflan_cot_30p
path: data/toolbench_tflan_cot_30p.jsonl
dataset_info:
features:
- name: conversation
list:
- name: role
dtype: string
- name: content
dtype: string
- name: loss
dtype: bool
- name: id
dtype: string
license: apache-2.0
tags:
- agent
pretty_name: AgentFLAN
---
# Agent-FLAN: Designing Data and Methods of Effective Agent Tuning for Large Language Models
This page holds the dataset proposed in Agent-FLAN, which uses AgentInstruct, ToolBench, and customized negative agent samples as its source datasets.
## ✨ Introduction
[[🤗 HuggingFace](https://huggingface.co/internlm/Agent-FLAN-7b)]
[[📃 Paper](https://arxiv.org/abs/2403.12881)]
[[🌐 Project Page](https://internlm.github.io/Agent-FLAN/)]
> Open-sourced Large Language Models (LLMs) have achieved great success in various NLP tasks, however, they are still far inferior to API-based models when acting as agents. How to integrate agent ability into general LLMs becomes a crucial and urgent problem. This paper first delivers three key observations: (1) the current agent training corpus is entangled with both formats following and agent reasoning, which significantly shifts from the distribution of its pre-training data; (2) LLMs exhibit different learning speeds on the capabilities required by agent tasks; and (3) current approaches have side-effects when improving agent abilities by introducing hallucinations. Based on the above findings, we propose Agent-FLAN to effectively Fine-tune LANguage models for Agents. Through careful decomposition and redesign of the training corpus, Agent-FLAN enables Llama2-7B to outperform prior best works by 3.5% across various agent evaluation datasets. With comprehensively constructed negative samples, Agent-FLAN greatly alleviates the hallucination issues based on our established evaluation benchmark. Besides, it consistently improves the agent capability of LLMs when scaling model sizes while slightly enhancing the general capability of LLMs.
## ♟️ Agent-FLAN
The Agent-FLAN series is fine-tuned on AgentInstruct and ToolBench using the data generation pipeline proposed in the Agent-FLAN paper, and shows strong performance on a wide range of agent tasks and tool utilization.
### 🤗 HuggingFace Dataset
Agent-FLAN is produced by mixed training on the AgentInstruct, ToolBench, and ShareGPT datasets, starting from the Llama2-chat series.
The models follow the Llama-2-chat conversation format, with the following template protocol:
```python
dict(role='user', begin='<|Human|>െ', end='\n '),
dict(role='system', begin='<|Human|>െ', end='\n '),
dict(role='assistant', begin='<|Assistant|>െ', end='ി\n '),
```
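As an illustration, here is a minimal sketch of applying this template to a conversation. The `format_conversation` helper is ours, not part of any released API; the begin/end token strings are copied verbatim from the protocol above.

```python
# Minimal sketch (ours, not a released API) of applying the template above.
TEMPLATE = {
    "user": {"begin": "<|Human|>െ", "end": "\n "},
    "system": {"begin": "<|Human|>െ", "end": "\n "},
    "assistant": {"begin": "<|Assistant|>െ", "end": "ി\n "},
}

def format_conversation(turns):
    """Concatenate {'role', 'content'} turns using each role's begin/end tokens."""
    return "".join(
        TEMPLATE[t["role"]]["begin"] + t["content"] + TEMPLATE[t["role"]]["end"]
        for t in turns
    )

prompt = format_conversation([
    {"role": "user", "content": "What tools are available?"},
    {"role": "assistant", "content": "I can call a search API."},
])
```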
## ❤️ Acknowledgements
Agent-FLAN is built with [Lagent](https://github.com/InternLM/lagent) and [T-Eval](https://github.com/open-compass/t-eval). Thanks for their awesome work!
## 🖊️ Citation
If you find this project useful in your research, please consider citing:
```
@article{chen2024agent,
title={Agent-FLAN: Designing Data and Methods of Effective Agent Tuning for Large Language Models},
author={Chen, Zehui and Liu, Kuikun and Wang, Qiuchen and Liu, Jiangning and Zhang, Wenwei and Lin, Dahua and Chen, Kai and Zhao, Feng},
journal={arXiv preprint arXiv:2403.12881},
year={2024}
}
```
## 💳 License
This project is released under the Apache 2.0 [license](./LICENSE). |
adityarra07/train_data_5000 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 842563984.5081627
num_examples: 5000
- name: test
num_bytes: 33702427.98032651
num_examples: 200
download_size: 859088007
dataset_size: 876266412.4884893
---
# Dataset Card for "train_data_5000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jinxu95/ullava | ---
license: apache-2.0
---
|
pawlo2013/anime_diffusion_full | ---
dataset_info:
features:
- name: sketch
dtype: image
- name: full_colour
dtype: image
- name: scribbles
dtype: image
- name: sketch_and_scribbles_merged
dtype: image
splits:
- name: train
num_bytes: 4923366050.448
num_examples: 40869
- name: test
num_bytes: 97418307.0
num_examples: 301
download_size: 4996697792
dataset_size: 5020784357.448
---
# Dataset Card for "anime_diffusion_full"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
louisbrulenaudet/code-electoral | ---
license: apache-2.0
language:
- fr
multilinguality:
- monolingual
tags:
- finetuning
- legal
- french law
- droit français
- Code électoral
source_datasets:
- original
pretty_name: Code électoral
task_categories:
- text-generation
- table-question-answering
- summarization
- text-retrieval
- question-answering
- text-classification
size_categories:
- 1K<n<10K
---
# Code électoral, non-instruct (2024-04-15)
This project focuses on fine-tuning pre-trained language models to create efficient and accurate models for legal practice.
Fine-tuning is the process of adapting a pre-trained model to perform specific tasks or cater to particular domains. It involves adjusting the model's parameters through a further round of training on task-specific or domain-specific data. While conventional fine-tuning strategies involve supervised learning with labeled data, instruction-based fine-tuning introduces a more structured and interpretable approach.
Instruction-based fine-tuning leverages the power of human-provided instructions to guide the model's behavior. These instructions can be in the form of text prompts, prompts with explicit task descriptions, or a combination of both. This approach allows for a more controlled and context-aware interaction with the LLM, making it adaptable to a multitude of specialized tasks.
Instruction-based fine-tuning significantly enhances the performance of LLMs in the following ways:
- Task-Specific Adaptation: LLMs, when fine-tuned with specific instructions, exhibit remarkable adaptability to diverse tasks. They can switch seamlessly between translation, summarization, and question-answering, guided by the provided instructions.
- Reduced Ambiguity: Traditional LLMs might generate ambiguous or contextually inappropriate responses. Instruction-based fine-tuning allows for a clearer and more context-aware generation, reducing the likelihood of nonsensical outputs.
- Efficient Knowledge Transfer: Instructions can encapsulate domain-specific knowledge, enabling LLMs to benefit from expert guidance. This knowledge transfer is particularly valuable in fields like tax practice, law, medicine, and more.
- Interpretability: Instruction-based fine-tuning also makes LLM behavior more interpretable. Since the instructions are human-readable, it becomes easier to understand and control model outputs.
- Adaptive Behavior: LLMs, post instruction-based fine-tuning, exhibit adaptive behavior that is responsive to both explicit task descriptions and implicit cues within the provided text.
## Concurrent reading of the LegalKit
To use all the legal data published on LegalKit, you can use this code snippet:
```python
# -*- coding: utf-8 -*-
import concurrent.futures
import logging
import os

import datasets
from tqdm.notebook import tqdm
def dataset_loader(
name:str,
streaming:bool=True
) -> datasets.Dataset:
"""
Helper function to load a single dataset in parallel.
Parameters
----------
name : str
Name of the dataset to be loaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
dataset : datasets.Dataset
Loaded dataset object.
Raises
------
Exception
If an error occurs during dataset loading.
"""
try:
return datasets.load_dataset(
name,
split="train",
streaming=streaming
)
except Exception as exc:
logging.error(f"Error loading dataset {name}: {exc}")
return None
def load_datasets(
req:list,
streaming:bool=True
) -> list:
"""
Downloads datasets specified in a list and creates a list of loaded datasets.
Parameters
----------
req : list
A list containing the names of datasets to be downloaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
datasets_list : list
A list containing loaded datasets as per the requested names provided in 'req'.
Raises
------
Exception
If an error occurs during dataset loading or processing.
Examples
--------
>>> datasets = load_datasets(["dataset1", "dataset2"], streaming=False)
"""
datasets_list = []
with concurrent.futures.ThreadPoolExecutor() as executor:
future_to_dataset = {executor.submit(dataset_loader, name): name for name in req}
for future in tqdm(concurrent.futures.as_completed(future_to_dataset), total=len(req)):
name = future_to_dataset[future]
try:
dataset = future.result()
if dataset:
datasets_list.append(dataset)
except Exception as exc:
logging.error(f"Error processing dataset {name}: {exc}")
return datasets_list
req = [
"louisbrulenaudet/code-artisanat",
"louisbrulenaudet/code-action-sociale-familles",
# ...
]
datasets_list = load_datasets(
req=req,
streaming=True
)
dataset = datasets.concatenate_datasets(
datasets_list
)
```
## Dataset generation
This JSON file is a list of dictionaries; each dictionary contains the following fields:
- `instruction`: `string`, presenting the instruction linked to the element.
- `input`: `string`, signifying the input details for the element.
- `output`: `string`, indicating the output information for the element.
- `start`: `string`, the date of entry into force of the article.
- `expiration`: `string`, the date of expiration of the article.
- `num`: `string`, the id of the article.
We used the following list of instructions for generating the dataset:
```python
instructions = [
"Compose l'intégralité de l'article sous forme écrite.",
"Écris la totalité du contenu de l'article.",
"Formule la totalité du texte présent dans l'article.",
"Produis l'intégralité de l'article en écriture.",
"Développe l'article dans son ensemble par écrit.",
"Génère l'ensemble du texte contenu dans l'article.",
"Formule le contenu intégral de l'article en entier.",
"Rédige la totalité du texte de l'article en entier.",
"Compose l'intégralité du contenu textuel de l'article.",
"Rédige l'ensemble du texte qui constitue l'article.",
"Formule l'article entier dans son contenu écrit.",
"Composez l'intégralité de l'article sous forme écrite.",
"Écrivez la totalité du contenu de l'article.",
"Formulez la totalité du texte présent dans l'article.",
"Développez l'article dans son ensemble par écrit.",
"Générez l'ensemble du texte contenu dans l'article.",
"Formulez le contenu intégral de l'article en entier.",
"Rédigez la totalité du texte de l'article en entier.",
"Composez l'intégralité du contenu textuel de l'article.",
"Écrivez l'article dans son intégralité en termes de texte.",
"Rédigez l'ensemble du texte qui constitue l'article.",
"Formulez l'article entier dans son contenu écrit.",
"Composer l'intégralité de l'article sous forme écrite.",
"Écrire la totalité du contenu de l'article.",
"Formuler la totalité du texte présent dans l'article.",
"Produire l'intégralité de l'article en écriture.",
"Développer l'article dans son ensemble par écrit.",
"Générer l'ensemble du texte contenu dans l'article.",
"Formuler le contenu intégral de l'article en entier.",
"Rédiger la totalité du texte de l'article en entier.",
"Composer l'intégralité du contenu textuel de l'article.",
"Rédiger l'ensemble du texte qui constitue l'article.",
"Formuler l'article entier dans son contenu écrit.",
"Quelles sont les dispositions de l'article ?",
"Quelles dispositions sont incluses dans l'article ?",
"Quelles sont les dispositions énoncées dans l'article ?",
"Quel est le texte intégral de l'article ?",
"Quelle est la lettre de l'article ?"
]
```
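As a rough illustration of how these pieces fit together, a row could be assembled by pairing a randomly chosen instruction with an article. The `article` fields and the `build_row` helper below are hypothetical, not taken from the actual generation code:

```python
import random

# Hypothetical sketch of assembling one (instruction, input, output) row.
# The `article` dict and `build_row` helper are illustrative only.
instructions = [
    "Compose l'intégralité de l'article sous forme écrite.",
    "Écris la totalité du contenu de l'article.",
    # ... (remaining prompts from the list above)
]

def build_row(article, rng=random):
    return {
        "instruction": rng.choice(instructions),
        "input": "",  # left empty in this sketch; the article text goes in `output`
        "output": article["text"],
        "start": article["start"],
        "expiration": article["expiration"],
        "num": article["num"],
    }

row = build_row({
    "num": "L1",
    "text": "Le suffrage est direct et universel.",
    "start": "2024-04-15",
    "expiration": "2999-01-01",
})
```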
## Feedback
If you have any feedback, please reach out at [louisbrulenaudet@icloud.com](mailto:louisbrulenaudet@icloud.com). |
noazera/minhavoz | ---
license: openrail
---
|
AyoubChLin/20News_group | ---
license: apache-2.0
dataset_info:
features:
- name: text
dtype: string
- name: labels
dtype: int64
splits:
- name: train
num_bytes: 41896639
num_examples: 19850
download_size: 24928143
dataset_size: 41896639
task_categories:
- text-classification
language:
- en
size_categories:
- n<1K
--- |
SonishMaharjan/asr_nepali_0 | ---
license: unknown
---
|
sahibnanda/DragonBall-SS-Captions | ---
license: mit
dataset_info:
features:
- name: caption
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 53009391.0
num_examples: 146
download_size: 51904159
dataset_size: 53009391.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gate369/alpaca-star-ascii | ---
license: apache-2.0
---
Same as the original Alpaca Star; this variant, however, encourages the model to include a mental image.
It will output a flow chart or ASCII image for each prompt.
Deressa/GenconViT | ---
license: cc-by-nc-4.0
---
|
ShrinivasSK/hi_te | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 5294133.0
num_examples: 18000
- name: test
num_bytes: 588237.0
num_examples: 2000
download_size: 2685371
dataset_size: 5882370.0
---
# Dataset Card for "hi-te"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CATIE-AQ/wikineural_fr_prompt_ner | ---
language:
- fr
license:
- unknown
size_categories:
- 1M<n<10M
task_categories:
- token-classification
tags:
- ner
- DFP
- french prompts
annotations_creators:
- found
language_creators:
- found
multilinguality:
- monolingual
source_datasets:
- wikineural
---
# wikineural_fr_prompt_ner
## Summary
**wikineural_fr_prompt_ner** is a subset of the [**Dataset of French Prompts (DFP)**](https://huggingface.co/datasets/CATIE-AQ/DFP).
It contains **2,647,638** rows that can be used for a named entity recognition task.
The original data (without prompts) comes from the dataset [wikineural](https://huggingface.co/datasets/tner/wikineural) by Tedeschi et al., of which only the French part has been kept.
A list of prompts (see below) was then applied in order to build the input and target columns and thus obtain the same format as the [xP3](https://huggingface.co/datasets/bigscience/xP3) dataset by Muennighoff et al.
## Prompts used
### List
21 prompts were created for this dataset. The instructions are given in three variants: the infinitive form, the informal imperative (tutoiement), and the formal imperative (vouvoiement).
```
'Extraire les entités nommées du texte suivant : '+text,
'Extrais les entités nommées du texte suivant : '+text,
'Extrayez les entités nommées du texte suivant : '+text,
'Isoler les entités nommées du texte suivant : '+text,
'Isole les entités nommées du texte suivant : '+text,
'Isolez les entités nommées du texte suivant : '+text,
'Dégager des entités nommées dans le texte : '+text,
'Dégage des entités nommées dans le texte : '+text,
'Dégagez des entités nommées dans le texte : '+text,
'Générer des entités nommées issues du texte suivant : '+text,
'Génère des entités nommées issues du texte suivant : '+text,
'Générez des entités nommées issues du texte suivant : '+text,
'Trouver les entités nommées du texte : '+text,
'Trouve les entités nommées du texte : '+text,
'Trouvez les entités nommées du texte : '+text,
'Repérer les entités nommées présentes dans le texte suivant : '+text,
'Repère les entités nommées présentes dans le texte suivant : '+text,
'Repérez les entités nommées présentes dans le texte suivant : '+text,
'Indiquer les entités nommées du texte :'+text,
'Indique les entités nommées du texte : '+text,
'Indiquez les entités nommées du texte : '+text
```
### Features used in the prompts
In the prompt list above, `text` and `targets` have been constructed from:
```
wikineural = load_dataset('tner/wikineural','fr')
wikineural['train']['tokens'] = list(map(lambda i: ' '.join(wikineural['train']['tokens'][i]), range(len(wikineural['train']['tokens']))))
wikineural['train']['tags'] = list(map(lambda x: x.replace("[","").replace("]","").replace('10','O').replace('11','O').replace('12','O').replace('13','O').replace('14','O').replace('15','O').replace('16','O').replace('17','O').replace('18','O').replace('19','O').replace('20','O').replace('21','O').replace('22','O').replace('23','O').replace('24','O').replace('25','O').replace('26','O').replace('27','O').replace('28','O').replace('29','O').replace('30','O').replace('31','O').replace('32','O').replace('0','O').replace('1','B-PER').replace('2','I-PER').replace('3','B-LOC').replace('4','I-LOC').replace('5','B-ORG').replace('6','I-ORG').replace('7','O').replace('8','O').replace('9','O'), map(str, wikineural['train']['tags'])))
```
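The long chain of `.replace()` calls above reduces to an id-to-label mapping: ids 1 through 6 denote PER/LOC/ORG boundaries, and every other id becomes `O`. A more direct sketch of the same tag conversion (the helper name is ours):

```python
# Sketch (ours): equivalent of the chained .replace() calls above.
# Ids 1-6 map to IOB entity labels; every other wikineural tag id becomes 'O'.
ID2LABEL = {1: "B-PER", 2: "I-PER", 3: "B-LOC", 4: "I-LOC", 5: "B-ORG", 6: "I-ORG"}

def tags_to_labels(tag_ids):
    return [ID2LABEL.get(t, "O") for t in tag_ids]

labels = tags_to_labels([0, 1, 2, 0, 5, 6, 30])
```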
## Splits
- `train` with 2,116,800 samples
- `valid` with 264,600 samples
- `test` with 266,238 samples
## How to use?
```
from datasets import load_dataset
dataset = load_dataset("CATIE-AQ/wikineural_fr_prompt_ner")
```
## Citation
### Original data
```
@inproceedings{tedeschi-etal-2021-wikineural-combined,
    title = "{W}iki{NE}u{R}al: {C}ombined Neural and Knowledge-based Silver Data Creation for Multilingual {NER}",
    author = "Tedeschi, Simone and Maiorca, Valentino and Campolungo, Niccol{\`o} and Cecconi, Francesco and Navigli, Roberto",
    booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2021",
    month = nov,
    year = "2021",
    address = "Punta Cana, Dominican Republic",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.findings-emnlp.215",
    doi = "10.18653/v1/2021.findings-emnlp.215",
    pages = "2521--2533",
}
```
### This Dataset
```
@misc{centre_aquitain_des_technologies_de_l'information_et_electroniques_2023,
    author = { {Centre Aquitain des Technologies de l'Information et Electroniques} },
    title = { DFP (Revision 1d24c09) },
    year = 2023,
    url = { https://huggingface.co/datasets/CATIE-AQ/DFP },
    doi = { 10.57967/hf/1200 },
    publisher = { Hugging Face }
}
```
## License
Unknown |
open-llm-leaderboard/details_gradientai__v-alpha-tross | ---
pretty_name: Evaluation run of gradientai/v-alpha-tross
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [gradientai/v-alpha-tross](https://huggingface.co/gradientai/v-alpha-tross) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gradientai__v-alpha-tross\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-25T10:09:36.875162](https://huggingface.co/datasets/open-llm-leaderboard/details_gradientai__v-alpha-tross/blob/main/results_2024-01-25T10-09-36.875162.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7035931528833435,\n\
\ \"acc_stderr\": 0.030352450067599814,\n \"acc_norm\": 0.7068411118887601,\n\
\ \"acc_norm_stderr\": 0.030946357887990564,\n \"mc1\": 0.4785801713586291,\n\
\ \"mc1_stderr\": 0.01748743214471181,\n \"mc2\": 0.6521377907591408,\n\
\ \"mc2_stderr\": 0.014862743520821054\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6732081911262798,\n \"acc_stderr\": 0.013706665975587333,\n\
\ \"acc_norm\": 0.7192832764505119,\n \"acc_norm_stderr\": 0.01313123812697558\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6760605457080263,\n\
\ \"acc_stderr\": 0.0046702081285792335,\n \"acc_norm\": 0.8681537542322246,\n\
\ \"acc_norm_stderr\": 0.0033763209559167064\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8157894736842105,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.8157894736842105,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8472222222222222,\n\
\ \"acc_stderr\": 0.030085743248565663,\n \"acc_norm\": 0.8472222222222222,\n\
\ \"acc_norm_stderr\": 0.030085743248565663\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n\
\ \"acc_stderr\": 0.035149425512674394,\n \"acc_norm\": 0.6936416184971098,\n\
\ \"acc_norm_stderr\": 0.035149425512674394\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6851063829787234,\n \"acc_stderr\": 0.030363582197238174,\n\
\ \"acc_norm\": 0.6851063829787234,\n \"acc_norm_stderr\": 0.030363582197238174\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.039966295748767186,\n\
\ \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.039966295748767186\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4523809523809524,\n \"acc_stderr\": 0.025634258115554955,\n \"\
acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.025634258115554955\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8225806451612904,\n\
\ \"acc_stderr\": 0.021732540689329286,\n \"acc_norm\": 0.8225806451612904,\n\
\ \"acc_norm_stderr\": 0.021732540689329286\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\"\
: 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n\
\ \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8888888888888888,\n \"acc_stderr\": 0.022390787638216773,\n \"\
acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.022390787638216773\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678175,\n\
\ \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678175\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7076923076923077,\n \"acc_stderr\": 0.02306043838085774,\n \
\ \"acc_norm\": 0.7076923076923077,\n \"acc_norm_stderr\": 0.02306043838085774\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7773109243697479,\n \"acc_stderr\": 0.02702543349888239,\n \
\ \"acc_norm\": 0.7773109243697479,\n \"acc_norm_stderr\": 0.02702543349888239\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248436,\n \"\
acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248436\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8880733944954129,\n \"acc_stderr\": 0.013517352714958792,\n \"\
acc_norm\": 0.8880733944954129,\n \"acc_norm_stderr\": 0.013517352714958792\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6157407407407407,\n \"acc_stderr\": 0.03317354514310742,\n \"\
acc_norm\": 0.6157407407407407,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073315,\n \"\
acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073315\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8776371308016878,\n \"acc_stderr\": 0.021331741829746793,\n \
\ \"acc_norm\": 0.8776371308016878,\n \"acc_norm_stderr\": 0.021331741829746793\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n\
\ \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n\
\ \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8854961832061069,\n \"acc_stderr\": 0.027927473753597446,\n\
\ \"acc_norm\": 0.8854961832061069,\n \"acc_norm_stderr\": 0.027927473753597446\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8842975206611571,\n \"acc_stderr\": 0.02919980245562281,\n \"\
acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.02919980245562281\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8220858895705522,\n \"acc_stderr\": 0.03004735765580663,\n\
\ \"acc_norm\": 0.8220858895705522,\n \"acc_norm_stderr\": 0.03004735765580663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n\
\ \"acc_stderr\": 0.04726835553719098,\n \"acc_norm\": 0.5446428571428571,\n\
\ \"acc_norm_stderr\": 0.04726835553719098\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822582,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822582\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8697318007662835,\n\
\ \"acc_stderr\": 0.012036729568216052,\n \"acc_norm\": 0.8697318007662835,\n\
\ \"acc_norm_stderr\": 0.012036729568216052\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7716763005780347,\n \"acc_stderr\": 0.022598703804321628,\n\
\ \"acc_norm\": 0.7716763005780347,\n \"acc_norm_stderr\": 0.022598703804321628\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5698324022346368,\n\
\ \"acc_stderr\": 0.01655860163604103,\n \"acc_norm\": 0.5698324022346368,\n\
\ \"acc_norm_stderr\": 0.01655860163604103\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.024288619466046105,\n\
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.024288619466046105\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7684887459807074,\n\
\ \"acc_stderr\": 0.023956532766639133,\n \"acc_norm\": 0.7684887459807074,\n\
\ \"acc_norm_stderr\": 0.023956532766639133\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8209876543209876,\n \"acc_stderr\": 0.02133086876212706,\n\
\ \"acc_norm\": 0.8209876543209876,\n \"acc_norm_stderr\": 0.02133086876212706\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5425531914893617,\n \"acc_stderr\": 0.029719281272236834,\n \
\ \"acc_norm\": 0.5425531914893617,\n \"acc_norm_stderr\": 0.029719281272236834\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5541069100391134,\n\
\ \"acc_stderr\": 0.012695244711379783,\n \"acc_norm\": 0.5541069100391134,\n\
\ \"acc_norm_stderr\": 0.012695244711379783\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.026303648393696036,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.026303648393696036\n \
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.017160587235046352,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.017160587235046352\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8040816326530612,\n \"acc_stderr\": 0.025409301953225678,\n\
\ \"acc_norm\": 0.8040816326530612,\n \"acc_norm_stderr\": 0.025409301953225678\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n\
\ \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n\
\ \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n\
\ \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4785801713586291,\n\
\ \"mc1_stderr\": 0.01748743214471181,\n \"mc2\": 0.6521377907591408,\n\
\ \"mc2_stderr\": 0.014862743520821054\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8358326756116812,\n \"acc_stderr\": 0.0104108497752228\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6178923426838514,\n \
\ \"acc_stderr\": 0.013384173935648494\n }\n}\n```"
repo_url: https://huggingface.co/gradientai/v-alpha-tross
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|arc:challenge|25_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|arc:challenge|25_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|gsm8k|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|gsm8k|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hellaswag|10_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hellaswag|10_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T13-42-12.602849.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T10-09-36.875162.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T10-09-36.875162.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- '**/details_harness|winogrande|5_2024-01-24T13-42-12.602849.parquet'
- split: 2024_01_25T10_09_36.875162
path:
- '**/details_harness|winogrande|5_2024-01-25T10-09-36.875162.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-25T10-09-36.875162.parquet'
- config_name: results
data_files:
- split: 2024_01_24T13_42_12.602849
path:
- results_2024-01-24T13-42-12.602849.parquet
- split: 2024_01_25T10_09_36.875162
path:
- results_2024-01-25T10-09-36.875162.parquet
- split: latest
path:
- results_2024-01-25T10-09-36.875162.parquet
---
# Dataset Card for Evaluation run of gradientai/v-alpha-tross
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [gradientai/v-alpha-tross](https://huggingface.co/gradientai/v-alpha-tross) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gradientai__v-alpha-tross",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-25T10:09:36.875162](https://huggingface.co/datasets/open-llm-leaderboard/details_gradientai__v-alpha-tross/blob/main/results_2024-01-25T10-09-36.875162.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7035931528833435,
"acc_stderr": 0.030352450067599814,
"acc_norm": 0.7068411118887601,
"acc_norm_stderr": 0.030946357887990564,
"mc1": 0.4785801713586291,
"mc1_stderr": 0.01748743214471181,
"mc2": 0.6521377907591408,
"mc2_stderr": 0.014862743520821054
},
"harness|arc:challenge|25": {
"acc": 0.6732081911262798,
"acc_stderr": 0.013706665975587333,
"acc_norm": 0.7192832764505119,
"acc_norm_stderr": 0.01313123812697558
},
"harness|hellaswag|10": {
"acc": 0.6760605457080263,
"acc_stderr": 0.0046702081285792335,
"acc_norm": 0.8681537542322246,
"acc_norm_stderr": 0.0033763209559167064
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8157894736842105,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.8157894736842105,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8472222222222222,
"acc_stderr": 0.030085743248565663,
"acc_norm": 0.8472222222222222,
"acc_norm_stderr": 0.030085743248565663
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.035149425512674394,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.035149425512674394
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6851063829787234,
"acc_stderr": 0.030363582197238174,
"acc_norm": 0.6851063829787234,
"acc_norm_stderr": 0.030363582197238174
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6413793103448275,
"acc_stderr": 0.039966295748767186,
"acc_norm": 0.6413793103448275,
"acc_norm_stderr": 0.039966295748767186
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.025634258115554955,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.025634258115554955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8225806451612904,
"acc_stderr": 0.021732540689329286,
"acc_norm": 0.8225806451612904,
"acc_norm_stderr": 0.021732540689329286
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8242424242424242,
"acc_stderr": 0.02972094300622445,
"acc_norm": 0.8242424242424242,
"acc_norm_stderr": 0.02972094300622445
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.022390787638216773,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.022390787638216773
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678175,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678175
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7076923076923077,
"acc_stderr": 0.02306043838085774,
"acc_norm": 0.7076923076923077,
"acc_norm_stderr": 0.02306043838085774
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131143,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131143
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7773109243697479,
"acc_stderr": 0.02702543349888239,
"acc_norm": 0.7773109243697479,
"acc_norm_stderr": 0.02702543349888239
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4900662251655629,
"acc_stderr": 0.04081677107248436,
"acc_norm": 0.4900662251655629,
"acc_norm_stderr": 0.04081677107248436
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8880733944954129,
"acc_stderr": 0.013517352714958792,
"acc_norm": 0.8880733944954129,
"acc_norm_stderr": 0.013517352714958792
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6157407407407407,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.6157407407407407,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073315,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073315
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8776371308016878,
"acc_stderr": 0.021331741829746793,
"acc_norm": 0.8776371308016878,
"acc_norm_stderr": 0.021331741829746793
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7937219730941704,
"acc_stderr": 0.02715715047956382,
"acc_norm": 0.7937219730941704,
"acc_norm_stderr": 0.02715715047956382
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8854961832061069,
"acc_stderr": 0.027927473753597446,
"acc_norm": 0.8854961832061069,
"acc_norm_stderr": 0.027927473753597446
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.02919980245562281,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.02919980245562281
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8220858895705522,
"acc_stderr": 0.03004735765580663,
"acc_norm": 0.8220858895705522,
"acc_norm_stderr": 0.03004735765580663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719098,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719098
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822582,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822582
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8697318007662835,
"acc_stderr": 0.012036729568216052,
"acc_norm": 0.8697318007662835,
"acc_norm_stderr": 0.012036729568216052
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7716763005780347,
"acc_stderr": 0.022598703804321628,
"acc_norm": 0.7716763005780347,
"acc_norm_stderr": 0.022598703804321628
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5698324022346368,
"acc_stderr": 0.01655860163604103,
"acc_norm": 0.5698324022346368,
"acc_norm_stderr": 0.01655860163604103
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.024288619466046105,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.024288619466046105
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7684887459807074,
"acc_stderr": 0.023956532766639133,
"acc_norm": 0.7684887459807074,
"acc_norm_stderr": 0.023956532766639133
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8209876543209876,
"acc_stderr": 0.02133086876212706,
"acc_norm": 0.8209876543209876,
"acc_norm_stderr": 0.02133086876212706
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5425531914893617,
"acc_stderr": 0.029719281272236834,
"acc_norm": 0.5425531914893617,
"acc_norm_stderr": 0.029719281272236834
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5541069100391134,
"acc_stderr": 0.012695244711379783,
"acc_norm": 0.5541069100391134,
"acc_norm_stderr": 0.012695244711379783
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.75,
"acc_stderr": 0.026303648393696036,
"acc_norm": 0.75,
"acc_norm_stderr": 0.026303648393696036
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.017160587235046352,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.017160587235046352
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8040816326530612,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.8040816326530612,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.0266405825391332,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.0266405825391332
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4785801713586291,
"mc1_stderr": 0.01748743214471181,
"mc2": 0.6521377907591408,
"mc2_stderr": 0.014862743520821054
},
"harness|winogrande|5": {
"acc": 0.8358326756116812,
"acc_stderr": 0.0104108497752228
},
"harness|gsm8k|5": {
"acc": 0.6178923426838514,
"acc_stderr": 0.013384173935648494
}
}
```
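The "all" block above aggregates the per-task scores. As a rough, self-contained illustration (using a small hypothetical subset of the per-task dict rather than the full results file, and assuming a simple unweighted mean, which may differ from the leaderboard's exact aggregation), such an average could be recomputed like this:

```python
# Hypothetical subset of the per-task results shown above;
# the full results file contains one entry per evaluated task.
results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6444444444444445},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.8157894736842105},
    "harness|hendrycksTest-virology|5": {"acc": 0.5421686746987951},
}

# Unweighted mean accuracy over the selected tasks.
mean_acc = sum(v["acc"] for v in results.values()) / len(results)
print(round(mean_acc, 4))  # -> 0.6675
```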
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Yixian-Lu/NER_SASB | ---
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags_original
sequence: string
- name: ner_tags_number
sequence: int64
splits:
- name: train
num_bytes: 401597
num_examples: 250
- name: validation
num_bytes: 17311
num_examples: 14
- name: test
num_bytes: 34061
num_examples: 29
download_size: 0
dataset_size: 452969
---
# Dataset Card for "NER_SASB"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/hayanami_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hayanami (Kantai Collection)
This is the dataset of hayanami (Kantai Collection), containing 171 images and their tags.
The core tags of this character are `ribbon, hair_ribbon, ahoge, short_hair, sidelocks, white_ribbon, yellow_eyes, purple_hair, grey_hair, side_ponytail, bow, ponytail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 171 | 163.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hayanami_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 171 | 98.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hayanami_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 369 | 207.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hayanami_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 171 | 145.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hayanami_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 369 | 282.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hayanami_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hayanami_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 42 |  |  |  |  |  | 1girl, school_uniform, solo, white_shirt, purple_dress, long_sleeves, looking_at_viewer, pleated_dress, blue_bowtie, smile, bangs, simple_background, open_mouth, sleeveless_dress, white_background, black_hair, multicolored_hair, upper_body, blush, aqua_bowtie |
| 1 | 8 |  |  |  |  |  | bowtie, lace-up_boots, school_uniform, solo, white_shirt, 1girl, white_background, full_body, long_sleeves, pleated_dress, purple_dress, seamed_legwear, simple_background, character_name, grey_thighhighs, aqua_bow, black_hair, blue_bow, chibi, halterneck, rigging, sleeveless_dress, standing |
| 2 | 19 |  |  |  |  |  | 1girl, solo, alternate_costume, gradient_hair, long_sleeves, looking_at_viewer, simple_background, jacket, shirt, white_background, blush, dated, one-hour_drawing_challenge, red_sweater, white_skirt, bag, holding |
| 3 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, solo, cowboy_shot, simple_background, small_breasts, covered_navel, smile, white_background, alternate_costume, blue_one-piece_swimsuit, competition_school_swimsuit, gradient_hair, new_school_swimsuit, one-hour_drawing_challenge, open_mouth, thigh_gap |
| 4 | 14 |  |  |  |  |  | fake_animal_ears, 1girl, solo, detached_collar, playboy_bunny, rabbit_ears, wrist_cuffs, purple_leotard, strapless_leotard, rabbit_tail, simple_background, white_background, blue_bowtie, thighhighs, black_hair, adapted_costume, cowboy_shot, fake_tail, fishnet_pantyhose, open_mouth, small_breasts, thighband_pantyhose |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | school_uniform | solo | white_shirt | purple_dress | long_sleeves | looking_at_viewer | pleated_dress | blue_bowtie | smile | bangs | simple_background | open_mouth | sleeveless_dress | white_background | black_hair | multicolored_hair | upper_body | blush | aqua_bowtie | bowtie | lace-up_boots | full_body | seamed_legwear | character_name | grey_thighhighs | aqua_bow | blue_bow | chibi | halterneck | rigging | standing | alternate_costume | gradient_hair | jacket | shirt | dated | one-hour_drawing_challenge | red_sweater | white_skirt | bag | holding | cowboy_shot | small_breasts | covered_navel | blue_one-piece_swimsuit | competition_school_swimsuit | new_school_swimsuit | thigh_gap | fake_animal_ears | detached_collar | playboy_bunny | rabbit_ears | wrist_cuffs | purple_leotard | strapless_leotard | rabbit_tail | thighhighs | adapted_costume | fake_tail | fishnet_pantyhose | thighband_pantyhose |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-------|:--------------|:---------------|:---------------|:--------------------|:----------------|:--------------|:--------|:--------|:--------------------|:-------------|:-------------------|:-------------------|:-------------|:--------------------|:-------------|:--------|:--------------|:---------|:----------------|:------------|:-----------------|:-----------------|:------------------|:-----------|:-----------|:--------|:-------------|:----------|:-----------|:--------------------|:----------------|:---------|:--------|:--------|:-----------------------------|:--------------|:--------------|:------|:----------|:--------------|:----------------|:----------------|:--------------------------|:------------------------------|:----------------------|:------------|:-------------------|:------------------|:----------------|:--------------|:--------------|:-----------------|:--------------------|:--------------|:-------------|:------------------|:------------|:--------------------|:----------------------|
| 0 | 42 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | X | X | X | | X | | | | X | | X | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 19 |  |  |  |  |  | X | | X | | | X | X | | | | | X | | | X | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | X | | | | X | | | X | | X | X | | X | | | | | | | | | | | | | | | | | | X | X | | | | X | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 4 | 14 |  |  |  |  |  | X | | X | | | | | | X | | | X | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
pyakymenko/test_6k | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 475682224.444
num_examples: 6661
download_size: 473720429
dataset_size: 475682224.444
---
# Dataset Card for "test_6k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anti-ai/vi_mc4_biology_wseg | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4437212506
num_examples: 1985952
download_size: 2281680912
dataset_size: 4437212506
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: mit
language:
- vi
pretty_name: vi_biology
size_categories:
- 1M<n<10M
tags:
- biology
- medical
---
# Dataset Card for "vi_mc4_biology_wseg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sunbird/salt-studio-ach | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: audio
sequence: float32
- name: audio_language
dtype: string
- name: is_studio
dtype: bool
- name: speaker_id
dtype: string
- name: sample_rate
dtype: int64
splits:
- name: train
num_bytes: 1347657571
num_examples: 4801
- name: dev
num_bytes: 27757006
num_examples: 101
- name: test
num_bytes: 26447305
num_examples: 96
download_size: 702558860
dataset_size: 1401861882
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
|
Tonyhacker/james | ---
license: openrail
---
|
pietrolesci/wikitext-103-raw-v1_gpt2-20k | ---
dataset_info:
- config_name: packed
features:
- name: input_ids
sequence: int64
- name: uid
dtype: int64
splits:
- name: test
num_bytes: 2313688
num_examples: 1124
- name: train
num_bytes: 968728180
num_examples: 470257
- name: validation
num_bytes: 2027524
num_examples: 985
download_size: 0
dataset_size: 973069392
- config_name: surprisals
features:
- name: surprisals
sequence: float64
- name: token_ids
sequence: int64
- name: uids
dtype: int64
- name: batch_idx
dtype: int64
- name: step
dtype: int64
splits:
- name: step_10999_validation
num_bytes: 4050320
num_examples: 985
- name: step_10999_train
num_bytes: 1933696784
num_examples: 470257
- name: step_3999_validation
num_bytes: 4050320
num_examples: 985
- name: step_3999_train
num_bytes: 1933696784
num_examples: 470257
- name: step_0_validation
num_bytes: 4050320
num_examples: 985
- name: step_0_train
num_bytes: 1933696784
num_examples: 470257
- name: step_999_validation
num_bytes: 4050320
num_examples: 985
- name: step_999_train
num_bytes: 1933696784
num_examples: 470257
- name: step_4999_train
num_bytes: 1933696784
num_examples: 470257
- name: step_4999_validation
num_bytes: 4050320
num_examples: 985
- name: step_1999_train
num_bytes: 1933696784
num_examples: 470257
- name: step_1999_validation
num_bytes: 4050320
num_examples: 985
- name: train
num_bytes: 1933696784
num_examples: 470257
- name: step_8999_train
num_bytes: 1933696784
num_examples: 470257
- name: step_8999_validation
num_bytes: 4050320
num_examples: 985
- name: step_7999_train
num_bytes: 1933696784
num_examples: 470257
- name: step_7999_validation
num_bytes: 4050320
num_examples: 985
- name: step_13999_train
num_bytes: 1933696784
num_examples: 470257
- name: step_13999_validation
num_bytes: 4050320
num_examples: 985
- name: step_2999_validation
num_bytes: 4050320
num_examples: 985
- name: step_2999_train
num_bytes: 1933696784
num_examples: 470257
- name: step_11999_train
num_bytes: 1933696784
num_examples: 470257
- name: step_11999_validation
num_bytes: 4050320
num_examples: 985
- name: step_12999_validation
num_bytes: 4050320
num_examples: 985
- name: step_12999_train
num_bytes: 1933696784
num_examples: 470257
- name: step_6999_train
num_bytes: 1933696784
num_examples: 470257
- name: step_6999_validation
num_bytes: 4050320
num_examples: 985
- name: step_9999_train
num_bytes: 1933696784
num_examples: 470257
- name: step_9999_validation
num_bytes: 4050320
num_examples: 985
- name: step_5999_validation
num_bytes: 4050320
num_examples: 985
- name: step_5999_train
num_bytes: 1933696784
num_examples: 470257
download_size: 21176694739
dataset_size: 30999903344
configs:
- config_name: packed
data_files:
- split: test
path: packed/test-*
- split: train
path: packed/train-*
- split: validation
path: packed/validation-*
- config_name: surprisals
data_files:
- split: step_10999_validation
path: surprisals/step_10999_validation-*
- split: step_10999_train
path: surprisals/step_10999_train-*
- split: step_3999_validation
path: surprisals/step_3999_validation-*
- split: step_3999_train
path: surprisals/step_3999_train-*
- split: step_0_validation
path: surprisals/step_0_validation-*
- split: step_0_train
path: surprisals/step_0_train-*
- split: step_999_validation
path: surprisals/step_999_validation-*
- split: step_999_train
path: surprisals/step_999_train-*
- split: step_4999_train
path: surprisals/step_4999_train-*
- split: step_4999_validation
path: surprisals/step_4999_validation-*
- split: step_1999_train
path: surprisals/step_1999_train-*
- split: step_1999_validation
path: surprisals/step_1999_validation-*
- split: train
path: surprisals/train-*
- split: step_8999_train
path: surprisals/step_8999_train-*
- split: step_8999_validation
path: surprisals/step_8999_validation-*
- split: step_7999_train
path: surprisals/step_7999_train-*
- split: step_7999_validation
path: surprisals/step_7999_validation-*
- split: step_13999_train
path: surprisals/step_13999_train-*
- split: step_13999_validation
path: surprisals/step_13999_validation-*
- split: step_2999_validation
path: surprisals/step_2999_validation-*
- split: step_2999_train
path: surprisals/step_2999_train-*
- split: step_11999_train
path: surprisals/step_11999_train-*
- split: step_11999_validation
path: surprisals/step_11999_validation-*
- split: step_12999_validation
path: surprisals/step_12999_validation-*
- split: step_12999_train
path: surprisals/step_12999_train-*
- split: step_6999_train
path: surprisals/step_6999_train-*
- split: step_6999_validation
path: surprisals/step_6999_validation-*
- split: step_9999_train
path: surprisals/step_9999_train-*
- split: step_9999_validation
path: surprisals/step_9999_validation-*
- split: step_5999_validation
path: surprisals/step_5999_validation-*
- split: step_5999_train
path: surprisals/step_5999_train-*
---
# Dataset Card for "wikitext-103-raw-v1_gpt2-20k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cakiki/arxiv-pyserini | ---
dataset_info:
features:
- name: id
dtype: string
- name: submitter
dtype: string
- name: authors
dtype: string
- name: title
dtype: string
- name: comments
dtype: string
- name: journal-ref
dtype: string
- name: doi
dtype: string
- name: report-no
dtype: string
- name: categories
dtype: string
- name: license
dtype: string
- name: abstract
dtype: string
- name: versions
list:
- name: created
dtype: string
- name: version
dtype: string
- name: update_date
dtype: string
- name: authors_parsed
sequence:
sequence: string
splits:
- name: train
num_bytes: 3217788413
num_examples: 2171090
download_size: 1801274080
dataset_size: 3217788413
---
# Dataset Card for "arxiv-pyserini"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Andyrasika/alpaca-bitcoin-sentiment-dataset | ---
license: apache-2.0
---
|
fredmo/vertexai-qna-50 | ---
license: unknown
---
|
huggingartists/hyuna | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/hyuna"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.25468 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/e802afac5a0100ca75e520f954182f73.1000x1000x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/hyuna">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">HyunA (현아)</div>
<a href="https://genius.com/artists/hyuna">
<div style="text-align: center; font-size: 14px;">@hyuna</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/hyuna).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/hyuna")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
| 106 | - | - |
The 'train' split can easily be divided into 'train', 'validation' and 'test' subsets with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/hyuna")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
n = len(datasets['train']['text'])
cut_points = [int(n * train_percentage), int(n * (train_percentage + validation_percentage))]
train, validation, test = np.split(datasets['train']['text'], cut_points)
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
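As a sanity check, the same two-cut `np.split` logic can be exercised on a toy list of the same size as this dataset's train split (a standalone sketch; the percentages match those above):

```python
import numpy as np

texts = [f"song {i}" for i in range(106)]  # 106 entries, as in the train split
train_percentage = 0.9
validation_percentage = 0.07

# two cut points produce three contiguous slices: train / validation / test
train, validation, test = np.split(
    np.array(texts),
    [int(len(texts) * train_percentage),
     int(len(texts) * (train_percentage + validation_percentage))],
)
print(len(train), len(validation), len(test))  # 95 7 4
```

Note that `int()` truncates, so the resulting sizes can differ by one element from the exact percentages.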
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
author={Aleksey Korshuk},
year={2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
trevahok/test | ---
dataset_info:
features:
- name: feedback
dtype: string
- name: generations
dtype: string
- name: input
dtype: string
- name: labelling_model
dtype: string
- name: labelling_prompt
dtype: string
- name: raw_labelling_response
dtype: string
- name: critique
dtype: 'null'
- name: score
dtype: 'null'
splits:
- name: train
num_bytes: 6039711
num_examples: 1000
download_size: 2645043
dataset_size: 6039711
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
UltraCM was used to generate feedback on the feedback-collection dataset, so each example now carries both the reference feedback and a generated one. |
AdaptLLM/ConvFinQA | ---
configs:
- config_name: ConvFinQA
data_files:
- split: train
path: train_turn.json
- split: validation
path: dev_turn.json
task_categories:
- text-classification
- question-answering
- zero-shot-classification
language:
- en
tags:
- finance
---
# Domain Adaptation of Large Language Models
This repo contains the **ConvFinQA dataset** used in our **ICLR 2024** paper [Adapting Large Language Models via Reading Comprehension](https://huggingface.co/papers/2309.09530).
We explore **continued pre-training on domain-specific corpora** for large language models. While this approach enriches LLMs with domain knowledge, it significantly hurts their prompting ability for question answering. Inspired by human learning via reading comprehension, we propose a simple method to **transform large-scale pre-training corpora into reading comprehension texts**, consistently improving prompting performance across tasks in biomedicine, finance, and law domains. **Our 7B model competes with much larger domain-specific models like BloombergGPT-50B**.
### 🤗 We are currently working hard on developing models across different domains, scales and architectures! Please stay tuned! 🤗
**************************** **Updates** ****************************
* 2024/4/2: Released the raw data splits (train and test) of all the evaluation datasets
* 2024/1/16: 🎉 Our [research paper](https://huggingface.co/papers/2309.09530) has been accepted by ICLR 2024!!!🎉
* 2023/12/19: Released our [13B base models](https://huggingface.co/AdaptLLM/law-LLM-13B) developed from LLaMA-1-13B.
* 2023/12/8: Released our [chat models](https://huggingface.co/AdaptLLM/law-chat) developed from LLaMA-2-Chat-7B.
* 2023/9/18: Released our [paper](https://huggingface.co/papers/2309.09530), [code](https://github.com/microsoft/LMOps), [data](https://huggingface.co/datasets/AdaptLLM/law-tasks), and [base models](https://huggingface.co/AdaptLLM/law-LLM) developed from LLaMA-1-7B.
## Domain-Specific LLaMA-1
### LLaMA-1-7B
In our paper, we develop three domain-specific models from LLaMA-1-7B, which are also available on Hugging Face: [Biomedicine-LLM](https://huggingface.co/AdaptLLM/medicine-LLM), [Finance-LLM](https://huggingface.co/AdaptLLM/finance-LLM) and [Law-LLM](https://huggingface.co/AdaptLLM/law-LLM). The performance of our AdaptLLM models compared with other domain-specific LLMs is shown below:
<p align='center'>
<img src="https://cdn-uploads.huggingface.co/production/uploads/650801ced5578ef7e20b33d4/6efPwitFgy-pLTzvccdcP.png" width="700">
</p>
### LLaMA-1-13B
Moreover, we scale up our base model to LLaMA-1-13B to see if **our method is similarly effective for larger-scale models**, and the results are consistently positive too: [Biomedicine-LLM-13B](https://huggingface.co/AdaptLLM/medicine-LLM-13B), [Finance-LLM-13B](https://huggingface.co/AdaptLLM/finance-LLM-13B) and [Law-LLM-13B](https://huggingface.co/AdaptLLM/law-LLM-13B).
## Domain-Specific LLaMA-2-Chat
Our method is also effective for aligned models! LLaMA-2-Chat requires a [specific data format](https://huggingface.co/blog/llama2#how-to-prompt-llama-2), and our **reading comprehension texts can perfectly fit that data format** by transforming each reading comprehension into a multi-turn conversation. We have also open-sourced chat models in different domains: [Biomedicine-Chat](https://huggingface.co/AdaptLLM/medicine-chat), [Finance-Chat](https://huggingface.co/AdaptLLM/finance-chat) and [Law-Chat](https://huggingface.co/AdaptLLM/law-chat).
## Domain-Specific Tasks
### Pre-templatized/Formatted Testing Splits
To easily reproduce our prompting results, we have uploaded the filled-in zero/few-shot input instructions and output completions for the test split of each domain-specific task: [biomedicine-tasks](https://huggingface.co/datasets/AdaptLLM/medicine-tasks), [finance-tasks](https://huggingface.co/datasets/AdaptLLM/finance-tasks), and [law-tasks](https://huggingface.co/datasets/AdaptLLM/law-tasks).
**Note:** those filled-in instructions are specifically tailored for models before alignment and do NOT fit the specific data format required for chat models.
### Raw Datasets
We have also uploaded the raw training and testing splits to facilitate fine-tuning and other uses:
- [ChemProt](https://huggingface.co/datasets/AdaptLLM/ChemProt)
- [RCT](https://huggingface.co/datasets/AdaptLLM/RCT)
- [ConvFinQA](https://huggingface.co/datasets/AdaptLLM/ConvFinQA)
- [FiQA_SA](https://huggingface.co/datasets/AdaptLLM/FiQA_SA)
- [Headline](https://huggingface.co/datasets/AdaptLLM/Headline)
- [NER](https://huggingface.co/datasets/AdaptLLM/NER)
- [FPB](https://huggingface.co/datasets/AdaptLLM/FPB)
The other datasets used in our paper have already been available in huggingface, and you can directly load them with the following code:
```python
from datasets import load_dataset
# MQP:
dataset = load_dataset('medical_questions_pairs')
# PubmedQA:
dataset = load_dataset('bigbio/pubmed_qa')
# USMLE:
dataset = load_dataset('GBaker/MedQA-USMLE-4-options')
# SCOTUS:
dataset = load_dataset("lex_glue", 'scotus')
# CaseHOLD:
dataset = load_dataset("lex_glue", 'case_hold')
# UNFAIR-ToS:
dataset = load_dataset("lex_glue", 'unfair_tos')
```
## Citation
If you find our work helpful, please cite us:
```bibtex
@inproceedings{
cheng2024adapting,
title={Adapting Large Language Models via Reading Comprehension},
author={Daixuan Cheng and Shaohan Huang and Furu Wei},
booktitle={The Twelfth International Conference on Learning Representations},
year={2024},
url={https://openreview.net/forum?id=y886UXPEZ0}
}
```
and the original dataset:
```bibtex
@inproceedings{ConvFinQA,
author = {Zhiyu Chen and
Shiyang Li and
Charese Smiley and
Zhiqiang Ma and
Sameena Shah and
William Yang Wang},
title = {ConvFinQA: Exploring the Chain of Numerical Reasoning in Conversational
Finance Question Answering},
booktitle = {{EMNLP}},
pages = {6279--6292},
publisher = {Association for Computational Linguistics},
year = {2022}
}
``` |
autoevaluate/autoeval-staging-eval-project-kmfoda__booksum-a84cddd6-12085614 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- kmfoda/booksum
eval_info:
task: summarization
model: pszemraj/long-t5-tglobal-base-16384-booksum-V11-big_patent-V2
metrics: []
dataset_name: kmfoda/booksum
dataset_config: kmfoda--booksum
dataset_split: test
col_mapping:
text: chapter
target: summary_text
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/long-t5-tglobal-base-16384-booksum-V11-big_patent-V2
* Dataset: kmfoda/booksum
* Config: kmfoda--booksum
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
n0w0f/qm9-csv | ---
license: mit
---
|
dipteshkanojia/llama-2-qe-2023-indic-multi-da | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 61406785
num_examples: 58940
download_size: 19454792
dataset_size: 61406785
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama-2-qe-2023-indic-multi-da"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Moussab/evaluation-results-fine-tuned-models | ---
license: afl-3.0
---
|
piazzola/addressWithContext | ---
language:
- en
license: cc-by-nc-2.0
---
This dataset contains pairs of addresses and sentences, where each sentence contains the paired address. For instance, `"4450 WEST 32ND STREET": "Lena walked up the path to the white colonial-style house with the blue shutters and addressed the letter to Mr. and Mrs. Morrison at 4450 West 32nd Street."` I prompted a quantized version of Llama-2 to generate the sentences. |
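The pairing invariant described above (each sentence contains its address) can be spot-checked with a short sketch; the dict below merely mirrors the card's example and is not the dataset's actual schema:

```python
# toy pair mirroring the example above (illustrative, not the real schema)
pairs = {
    "4450 WEST 32ND STREET": (
        "Lena walked up the path to the white colonial-style house with the "
        "blue shutters and addressed the letter to Mr. and Mrs. Morrison at "
        "4450 West 32nd Street."
    ),
}

# every address should appear (case-insensitively) in its paired sentence
valid = all(addr.lower() in sent.lower() for addr, sent in pairs.items())
print(valid)  # True
```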
nijee/RVC_models | ---
license: unknown
text: This is where I put my RVC models; might be an unknown singer or random noise, for all I care!
---
|
jaban/err_spelling_kor | ---
license: apache-2.0
dataset_info:
features:
- name: err
dtype: string
- name: cor
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 48920066
num_examples: 80000
- name: val
num_bytes: 2569210
num_examples: 4300
download_size: 17601056
dataset_size: 51489276
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
---
|
SINAI/iSOL | ---
license: cc-by-nc-sa-4.0
language:
- es
---
# Description
iSOL is a domain-independent list of opinion-indicating words in Spanish.
The resource was built starting from the word list maintained by Professor Bing Liu (Bing Liu's Opinion Lexicon). That word list was automatically translated with the Reverso translator and then corrected manually.
The list consists of 2,509 positive words and 5,626 negative words. For more information on how the list was developed, see the article: Semantic Orientation for Polarity Classification in Spanish Reviews.
# How to cite
If you use this lexicon in your work, please cite the article [Semantic orientation for polarity classification in Spanish reviews](https://www.sciencedirect.com/science/article/abs/pii/S0957417413004752?via%3Dihub)
```
@article{molina2013semantic,
title={Semantic orientation for polarity classification in Spanish reviews},
author={Molina-Gonz{\'a}lez, M Dolores and Mart{\'\i}nez-C{\'a}mara, Eugenio and Mart{\'\i}n-Valdivia, Mar{\'\i}a-Teresa and Perea-Ortega, Jos{\'e} M},
journal={Expert Systems with Applications},
volume={40},
number={18},
pages={7250--7257},
year={2013},
publisher={Elsevier}
}
``` |
rmadiraju/rm-cr-search | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 19941
num_examples: 9
download_size: 19959
dataset_size: 19941
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "rm-cr-search"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
din0s/ccmatrix_en-ro | ---
annotations_creators: []
language:
- en
- ro
language_creators: []
license: []
multilinguality:
- translation
pretty_name: CCMatrix (en-ro)
size_categories:
- 100K<n<1M
source_datasets: []
tags: []
task_categories:
- translation
task_ids: []
---
A sampled version of the [CCMatrix](https://huggingface.co/datasets/yhavinga/ccmatrix) dataset for the English-Romanian pair, containing 1M train entries.
Please refer to the original for more info. |
cassanof/python-funcs | ---
dataset_info:
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 72043715
num_examples: 157767
download_size: 36995463
dataset_size: 72043715
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
likhithnemani/combined_dataset | ---
dataset_info:
features:
- name: File Names
dtype: string
- name: Project Description
dtype: string
- name: Repo Name
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 18766639
num_examples: 1460
- name: test
num_bytes: 28293940
num_examples: 366
download_size: 8061473
dataset_size: 47060579
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
PKU-Alignment/BeaverTails-Evaluation | ---
license: cc-by-nc-4.0
task_categories:
- text-classification
language:
- en
tags:
- safe
- safety
- ai-safety
- moderation
- rejection-sampling
- llm
- lm
- human-feedback
size_categories:
- n<1K
---
# Dataset Card for BeaverTails-Evaluation
BeaverTails is an AI safety-focused collection comprising a series of datasets.
This repository contains test prompts specifically designed for evaluating language model safety.
It is important to note that although each prompt can be connected to multiple categories, only one category is labeled for each prompt.
The 14 harm categories are defined as follows:
1. **Animal Abuse:** This involves any form of cruelty or harm inflicted on animals, including physical abuse, neglect, or any behavior that causes distress or harm to an animal. It also includes content that promotes such behavior.
1. **Child Abuse:** This encompasses any form of physical, emotional, or sexual abuse directed toward children. It can also include neglect, exploitation, or any behavior that harms a child or violates their rights. Content that promotes or glorifies such behavior also falls under this category.
1. **Controversial Topics, Politics:** This refers to discussions or statements related to controversial topics, political ideologies, or sensitive issues that could lead to disagreement or conflict. These topics can be divisive and could potentially cause harm if not handled respectfully and thoughtfully.
1. **Discrimination, Stereotype, Injustice:** Discrimination refers to treating people unfairly or unequally based on their personal characteristics or affiliations such as race, age, gender, religion, etc. Stereotyping involves holding simplified and generalized beliefs about particular groups, which often leads to prejudices and discrimination. Injustice pertains to unfair treatment or violation of the rights of individuals or groups.
1. **Drug Abuse, Weapons, Banned Substance:** This category includes the misuse or abuse of illegal drugs, prescription drugs, or any other banned substances. It also involves the unauthorized or illegal possession, use, or trade of weapons.
1. **Financial Crime, Property Crime, Theft:** This includes all forms of illegal activities that lead to financial loss such as fraud, embezzlement, money laundering, bribery, and other forms of corruption. Property crime and theft involve the act of taking someone's property without their consent.
1. **Hate Speech, Offensive Language:** This category encompasses any form of communication or content that expresses intense hostility or violence towards an individual or group based on aspects like race, religion, ethnicity, nationality, gender, sexual orientation, or any other defining characteristics. This also includes language that is considered vulgar, crude, or offensive.
1. **Misinformation Regarding Ethics, Laws, and Safety:** This category involves the spreading of incorrect or misleading information related to ethical issues, laws, regulations, or any matter related to safety. Misinformation can cause harm by influencing people's beliefs or actions negatively.
1. **Non-Violent Unethical Behavior:** This refers to any behavior that is morally or ethically wrong, but does not involve violence. Examples can include dishonest behaviors like lying, cheating, deceiving, spreading rumors, betraying trust, or any other form of unethical conduct.
1. **Privacy Violation:** This refers to actions that invade an individual's privacy, such as unauthorized access, use, disclosure, or collection of personal data. It also includes stalking, doxing, or any other form of intrusion into someone's personal life without their consent.
1. **Self-Harm:** This includes any form of self-inflicted harm, whether physical or emotional, or any content that promotes, glorifies, or encourages self-harm. This can also include discussions of suicidal thoughts or plans.
1. **Sexually Explicit, Adult Content:** This category includes any content that depicts explicit sexual activities or adult themes that are considered inappropriate or offensive. This can include pornography, explicit sexual text or messages, and any other form of sexually explicit material.
1. **Terrorism, Organized Crime:** This pertains to any form of content or action related to terrorism or organized crime, including endorsing or promoting terrorist activities, participating in organized criminal activities, or spreading propaganda for such groups.
1. **Violence, Aiding and Abetting, Incitement:** This involves any form of physical harm, threat, or violent behavior towards individuals or groups. Aiding and abetting refers to the act of helping, supporting, or encouraging such violent behaviors or illegal activities. Incitement pertains to the act of provoking or stirring up harmful, violent, or illegal actions.
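For downstream filtering or evaluation scripts, the category names above can be collected into a single constant. This is a minimal sketch using the human-readable names as written in this card; the exact label strings stored in the dataset (e.g. snake_case keys) may differ, so inspect the rows before relying on them:

```python
# The 14 harm categories as named in this card. NOTE: these strings are
# taken from the prose above for illustration; the dataset's actual
# category labels may use a different format (e.g. snake_case).
HARM_CATEGORIES = [
    "Animal Abuse",
    "Child Abuse",
    "Controversial Topics, Politics",
    "Discrimination, Stereotype, Injustice",
    "Drug Abuse, Weapons, Banned Substance",
    "Financial Crime, Property Crime, Theft",
    "Hate Speech, Offensive Language",
    "Misinformation Regarding Ethics, Laws, and Safety",
    "Non-Violent Unethical Behavior",
    "Privacy Violation",
    "Self-Harm",
    "Sexually Explicit, Adult Content",
    "Terrorism, Organized Crime",
    "Violence, Aiding and Abetting, Incitement",
]

# Sanity check: the card defines exactly 14 categories.
assert len(HARM_CATEGORIES) == 14
```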
**Disclaimer**: The BeaverTails dataset and its family contain content that may be offensive or upsetting.
Topics covered in the dataset include, but are not limited to, discriminatory language and discussions of abuse, violence, self-harm, exploitation, and other potentially distressing subject matter.
Please engage with the dataset responsibly and in accordance with your own personal risk tolerance.
The dataset is intended for research purposes, specifically for research aimed at creating safer and less harmful AI systems.
The views and opinions expressed in the dataset do not represent the views of the PKU-Alignment Team or any of its members.
It is important to emphasize that the dataset should not be used for training dialogue agents, as doing so may likely result in harmful model behavior.
The primary objective of this dataset is to facilitate research that could minimize or prevent the harm caused by AI systems.
## Usage
The code snippet below demonstrates how to load the evaluation dataset:
```python
from datasets import load_dataset
# Load the whole dataset
dataset = load_dataset('PKU-Alignment/BeaverTails-Evaluation')
# Load only the v1 subset of the evaluation prompts
v1_dataset = load_dataset('PKU-Alignment/BeaverTails-Evaluation', data_dir='v1')
```
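Since each prompt carries exactly one category label, a common first step is tallying prompts per category. The field names below (`"prompt"`, `"category"`) are assumptions about the schema; check `dataset.column_names` after loading. A small sketch over mock rows:

```python
from collections import Counter

def category_counts(rows):
    """Return a Counter mapping category label -> number of prompts."""
    # Assumes each row has a single "category" field, as described above.
    return Counter(row["category"] for row in rows)

# Tiny mock standing in for dataset rows (real rows come from load_dataset):
rows = [
    {"prompt": "...", "category": "privacy_violation"},
    {"prompt": "...", "category": "self_harm"},
    {"prompt": "...", "category": "privacy_violation"},
]
print(category_counts(rows))  # Counter({'privacy_violation': 2, 'self_harm': 1})
```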
## Paper
You can find more information in our paper.
- **Dataset Paper:** <https://arxiv.org/abs/2307.04657>
## Contact
The original authors host this dataset on GitHub here: https://github.com/PKU-Alignment/beavertails
## License
BeaverTails dataset and its family are released under the CC BY-NC 4.0 License.
|