datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
tobaba2001/scs_phase2_ts_dataset11 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: answer
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 7703430
num_examples: 10614
download_size: 172401
dataset_size: 7703430
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
king007/testing | ---
license: afl-3.0
---
|
liuyanchen1015/MULTI_VALUE_mnli_conditional_were_was | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 17076
num_examples: 63
- name: dev_mismatched
num_bytes: 21689
num_examples: 93
- name: test_matched
num_bytes: 13813
num_examples: 51
- name: test_mismatched
num_bytes: 15276
num_examples: 63
- name: train
num_bytes: 641266
num_examples: 2306
download_size: 369606
dataset_size: 709120
---
# Dataset Card for "MULTI_VALUE_mnli_conditional_were_was"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LeonardoTiger/conduit | ---
license: openrail
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-34000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 653432
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/dahlia_pokemon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of dahlia (Pokémon)
This is the dataset of dahlia (Pokémon), containing 25 images and their tags.
The core tags of this character are `black_hair, long_hair, dark_skin, breasts, dark-skinned_female, blue_eyes, large_breasts`; these tags are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 25 | 15.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dahlia_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 25 | 11.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dahlia_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 45 | 20.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dahlia_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 25 | 14.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dahlia_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 45 | 26.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dahlia_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/dahlia_pokemon',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------|
| 0 | 25 |  |  |  |  |  | 1girl, smile, solo, pants, midriff, navel_piercing, denim, cleavage, crop_top |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | smile | solo | pants | midriff | navel_piercing | denim | cleavage | crop_top |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:--------|:----------|:-----------------|:--------|:-----------|:-----------|
| 0 | 25 |  |  |  |  |  | X | X | X | X | X | X | X | X | X |
|
Glac1er/Shades | ---
license: unknown
---
|
danjacobellis/audio_har_descript_24kHz | ---
dataset_info:
features:
- name: label
dtype: int64
- name: label_str
dtype: string
- name: participant
dtype: int64
- name: codes
sequence:
sequence:
sequence: int64
splits:
- name: train
num_bytes: 399945573
num_examples: 669
download_size: 62472760
dataset_size: 399945573
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
lucasmccabe-lmi/oig_small_chip2_noncode | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 54969398.0
num_examples: 150453
download_size: 34598598
dataset_size: 54969398.0
---
# Dataset Card for "oig_small_chip2_noncode"
From LAION's Open Instruction Generalist (OIG) dataset, we provide a subset whose samples are not code-related. OIG text elements are formatted as dialogue excerpts between a "human" and a "bot" agent. The prompt is parsed from the initial "human" agent's statement and the resultant response from the "bot" agent's statement. We then reformat the prompt/response pairs according to the format of the original Alpaca dataset; that is, as instruction/input/output triplets.
The OIG dataset was prepared by LAION, and released under the Apache 2.0 license.
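The reformatting described above can be sketched as follows. This is a minimal illustration, assuming a dialogue string with `<human>:` / `<bot>:` markers; the helper name `to_alpaca` and the exact field handling are assumptions for illustration, not the pipeline actually used to build this subset.

```python
# Sketch: convert an OIG-style dialogue excerpt into an Alpaca-style
# instruction/input/output triplet. The '<human>:'/'<bot>:' markers and
# the function name are illustrative assumptions, not the exact OIG schema.
def to_alpaca(text: str) -> dict:
    human, _, bot = text.partition('<bot>:')
    instruction = human.replace('<human>:', '').strip()
    return {
        'instruction': instruction,
        'input': '',   # Alpaca-style triplets with no auxiliary input
        'output': bot.strip(),
    }

example = "<human>: What is the capital of France? <bot>: The capital of France is Paris."
print(to_alpaca(example)['instruction'])  # -> What is the capital of France?
```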
Numbers:
- Prompts: 150453
- Tokens: 11522004, using the EleutherAI/gpt-neox-20b tokenizer (counting instruction+input+output) |
TrainingDataPro/wagons-images-classification | ---
language:
- en
license: cc-by-nc-nd-4.0
task_categories:
- image-classification
tags:
- code
- finance
dataset_info:
features:
- name: id
dtype: int32
- name: name
dtype: string
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': loaded
'1': unloaded
splits:
- name: train
num_bytes: 4452752
num_examples: 18
download_size: 4344062
dataset_size: 4452752
---
# Wagons Images Classification
The dataset consists of images depicting **loaded and unloaded** wagons. The data are organised into two folders, one for loaded and one for unloaded wagons, accompanied by a .CSV file containing the text classification of the images.
This dataset can be useful for various tasks, such as *image classification, object detection, and data-driven analyses* related to wagon loading and unloading processes.
The dataset is useful for the **rail transport sphere**: it can be utilised to automate the identification and classification of wagons and to further optimise processes in the industry.

# Get the dataset
### This is just an example of the data
Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market?utm_source=huggingface&utm_medium=cpc&utm_campaign=wagons-images-classification) to discuss your requirements, learn about the price and buy the dataset.
# Content
- **loaded**: includes images of loaded wagons
- **unloaded**: includes images of unloaded wagons
- **.csv file**: contains information about the dataset
### File with the extension .csv
includes the following information for each media file:
- **image_name**: link to access the image,
- **type**: type of the wagon in the image (**loaded/unloaded**)
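A minimal sketch of reading this .csv file and counting wagons per class. The file name `wagons.csv` is an assumption; the columns `image_name` and `type` follow the description above.

```python
# Sketch: read the dataset's .csv file and count images per wagon class.
# 'wagons.csv' is an assumed file name; 'type' is the label column
# described in the card ('loaded'/'unloaded').
import csv
from collections import Counter

def count_labels(csv_path: str) -> Counter:
    with open(csv_path, newline='', encoding='utf-8') as f:
        reader = csv.DictReader(f)
        return Counter(row['type'] for row in reader)
```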
# Wagon images might be collected and labeled in accordance with your requirements.
## **[TrainingData](https://trainingdata.pro/data-market?utm_source=huggingface&utm_medium=cpc&utm_campaign=wagons-images-classification)** provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets** |
cyrilzhang/TinyStories2-ascii-val-1600 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 2453198100
num_examples: 598341
- name: validation
num_bytes: 24690200
num_examples: 6022
download_size: 856137162
dataset_size: 2477888300
---
# Dataset Card for "TinyStories2-ascii-val-1600"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1713030011 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 17157
num_examples: 38
download_size: 11676
dataset_size: 17157
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713030011"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_itsliupeng__llama2_7b_zh | ---
pretty_name: Evaluation run of itsliupeng/llama2_7b_zh
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [itsliupeng/llama2_7b_zh](https://huggingface.co/itsliupeng/llama2_7b_zh) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_itsliupeng__llama2_7b_zh_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-15T10:51:37.128756](https://huggingface.co/datasets/open-llm-leaderboard/details_itsliupeng__llama2_7b_zh_public/blob/main/results_2023-11-15T10-51-37.128756.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5969511263414031,\n\
\ \"acc_stderr\": 0.0329865461490785,\n \"acc_norm\": 0.6078135521201408,\n\
\ \"acc_norm_stderr\": 0.03376504385445851,\n \"mc1\": 0.2766217870257038,\n\
\ \"mc1_stderr\": 0.015659605755326912,\n \"mc2\": 0.42858587749612026,\n\
\ \"mc2_stderr\": 0.014059235435250938,\n \"em\": 0.18791946308724833,\n\
\ \"em_stderr\": 0.004000599568072892,\n \"f1\": 0.23667890100671124,\n\
\ \"f1_stderr\": 0.003992615682814011\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.47952218430034127,\n \"acc_stderr\": 0.01459913135303501,\n\
\ \"acc_norm\": 0.5204778156996587,\n \"acc_norm_stderr\": 0.01459913135303501\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5608444532961562,\n\
\ \"acc_stderr\": 0.004952698802275648,\n \"acc_norm\": 0.7487552280422227,\n\
\ \"acc_norm_stderr\": 0.004328425700998689\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.03761070869867479,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.03761070869867479\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6566037735849056,\n \"acc_stderr\": 0.029224526469124792,\n\
\ \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.029224526469124792\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.0373362665538351,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.0373362665538351\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082634,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082634\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.03252909619613197,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.03252909619613197\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.044629175353369355,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.044629175353369355\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137285,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137285\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7064516129032258,\n\
\ \"acc_stderr\": 0.025906087021319295,\n \"acc_norm\": 0.7064516129032258,\n\
\ \"acc_norm_stderr\": 0.025906087021319295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.02649905770139744,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.02649905770139744\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5871794871794872,\n \"acc_stderr\": 0.024962683564331796,\n\
\ \"acc_norm\": 0.5871794871794872,\n \"acc_norm_stderr\": 0.024962683564331796\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.0303883535518868,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.0303883535518868\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8091743119266055,\n \"acc_stderr\": 0.01684767640009109,\n \"\
acc_norm\": 0.8091743119266055,\n \"acc_norm_stderr\": 0.01684767640009109\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7303921568627451,\n \"acc_stderr\": 0.031145570659486782,\n \"\
acc_norm\": 0.7303921568627451,\n \"acc_norm_stderr\": 0.031145570659486782\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677697,\n\
\ \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677697\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.03559039531617342,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.03559039531617342\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.023636873317489298,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.023636873317489298\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7726692209450831,\n\
\ \"acc_stderr\": 0.014987270640946012,\n \"acc_norm\": 0.7726692209450831,\n\
\ \"acc_norm_stderr\": 0.014987270640946012\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.025070713719153176,\n\
\ \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.025070713719153176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3217877094972067,\n\
\ \"acc_stderr\": 0.015624236160792582,\n \"acc_norm\": 0.3217877094972067,\n\
\ \"acc_norm_stderr\": 0.015624236160792582\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.02705797462449438,\n\
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.02705797462449438\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.026385273703464496,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.026385273703464496\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.02646248777700187,\n\
\ \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.02646248777700187\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.455019556714472,\n\
\ \"acc_stderr\": 0.012718456618701763,\n \"acc_norm\": 0.455019556714472,\n\
\ \"acc_norm_stderr\": 0.012718456618701763\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.02934980313976587,\n\
\ \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.02934980313976587\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6160130718954249,\n \"acc_stderr\": 0.01967580813528151,\n \
\ \"acc_norm\": 0.6160130718954249,\n \"acc_norm_stderr\": 0.01967580813528151\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n\
\ \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n\
\ \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533207,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533207\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2766217870257038,\n\
\ \"mc1_stderr\": 0.015659605755326912,\n \"mc2\": 0.42858587749612026,\n\
\ \"mc2_stderr\": 0.014059235435250938\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7174427782162589,\n \"acc_stderr\": 0.01265406285097139\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.18791946308724833,\n \
\ \"em_stderr\": 0.004000599568072892,\n \"f1\": 0.23667890100671124,\n\
\ \"f1_stderr\": 0.003992615682814011\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.06444275966641395,\n \"acc_stderr\": 0.006763391728488265\n\
\ }\n}\n```"
repo_url: https://huggingface.co/itsliupeng/llama2_7b_zh
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|arc:challenge|25_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|drop|3_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|gsm8k|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hellaswag|10_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T10-51-37.128756.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-15T10-51-37.128756.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- '**/details_harness|winogrande|5_2023-11-15T10-51-37.128756.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-15T10-51-37.128756.parquet'
- config_name: results
data_files:
- split: 2023_11_15T10_51_37.128756
path:
- results_2023-11-15T10-51-37.128756.parquet
- split: latest
path:
- results_2023-11-15T10-51-37.128756.parquet
---
# Dataset Card for Evaluation run of itsliupeng/llama2_7b_zh
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/itsliupeng/llama2_7b_zh
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [itsliupeng/llama2_7b_zh](https://huggingface.co/itsliupeng/llama2_7b_zh) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_itsliupeng__llama2_7b_zh_public",
"harness_winogrande_5",
	split="latest")
```
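The per-task MMLU configs listed above follow a predictable naming pattern, `harness_hendrycksTest_<task>_<n_shot>`, so a small helper (a hypothetical convenience, not part of the leaderboard tooling) can build the config name for any task before loading it:

```python
def mmlu_config(task: str, n_shot: int = 5) -> str:
    """Build the config name for a hendrycksTest (MMLU) task,
    e.g. mmlu_config("anatomy") -> "harness_hendrycksTest_anatomy_5"."""
    return f"harness_hendrycksTest_{task}_{n_shot}"

# Load the most recent details for one task (requires network access):
# from datasets import load_dataset
# data = load_dataset(
#     "open-llm-leaderboard/details_itsliupeng__llama2_7b_zh_public",
#     mmlu_config("abstract_algebra"),
#     split="latest",
# )
```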
## Latest results
These are the [latest results from run 2023-11-15T10:51:37.128756](https://huggingface.co/datasets/open-llm-leaderboard/details_itsliupeng__llama2_7b_zh_public/blob/main/results_2023-11-15T10-51-37.128756.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in its results file and in the "latest" split of its configuration):
```python
{
"all": {
"acc": 0.5969511263414031,
"acc_stderr": 0.0329865461490785,
"acc_norm": 0.6078135521201408,
"acc_norm_stderr": 0.03376504385445851,
"mc1": 0.2766217870257038,
"mc1_stderr": 0.015659605755326912,
"mc2": 0.42858587749612026,
"mc2_stderr": 0.014059235435250938,
"em": 0.18791946308724833,
"em_stderr": 0.004000599568072892,
"f1": 0.23667890100671124,
"f1_stderr": 0.003992615682814011
},
"harness|arc:challenge|25": {
"acc": 0.47952218430034127,
"acc_stderr": 0.01459913135303501,
"acc_norm": 0.5204778156996587,
"acc_norm_stderr": 0.01459913135303501
},
"harness|hellaswag|10": {
"acc": 0.5608444532961562,
"acc_stderr": 0.004952698802275648,
"acc_norm": 0.7487552280422227,
"acc_norm_stderr": 0.004328425700998689
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.03761070869867479,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.03761070869867479
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.029224526469124792,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.029224526469124792
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.0373362665538351,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.0373362665538351
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082634,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082634
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.03252909619613197,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.03252909619613197
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.044629175353369355,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.044629175353369355
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137285,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137285
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7064516129032258,
"acc_stderr": 0.025906087021319295,
"acc_norm": 0.7064516129032258,
"acc_norm_stderr": 0.025906087021319295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.02649905770139744,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.02649905770139744
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5871794871794872,
"acc_stderr": 0.024962683564331796,
"acc_norm": 0.5871794871794872,
"acc_norm_stderr": 0.024962683564331796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.02822644674968352,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.02822644674968352
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.0303883535518868,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.0303883535518868
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8091743119266055,
"acc_stderr": 0.01684767640009109,
"acc_norm": 0.8091743119266055,
"acc_norm_stderr": 0.01684767640009109
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7303921568627451,
"acc_stderr": 0.031145570659486782,
"acc_norm": 0.7303921568627451,
"acc_norm_stderr": 0.031145570659486782
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677697,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677697
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.03559039531617342,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.03559039531617342
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489298,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489298
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7726692209450831,
"acc_stderr": 0.014987270640946012,
"acc_norm": 0.7726692209450831,
"acc_norm_stderr": 0.014987270640946012
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.025070713719153176,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.025070713719153176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3217877094972067,
"acc_stderr": 0.015624236160792582,
"acc_norm": 0.3217877094972067,
"acc_norm_stderr": 0.015624236160792582
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.02705797462449438,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.02705797462449438
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464496,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464496
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.02646248777700187,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.02646248777700187
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.455019556714472,
"acc_stderr": 0.012718456618701763,
"acc_norm": 0.455019556714472,
"acc_norm_stderr": 0.012718456618701763
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6160130718954249,
"acc_stderr": 0.01967580813528151,
"acc_norm": 0.6160130718954249,
"acc_norm_stderr": 0.01967580813528151
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533207,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533207
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2766217870257038,
"mc1_stderr": 0.015659605755326912,
"mc2": 0.42858587749612026,
"mc2_stderr": 0.014059235435250938
},
"harness|winogrande|5": {
"acc": 0.7174427782162589,
"acc_stderr": 0.01265406285097139
},
"harness|drop|3": {
"em": 0.18791946308724833,
"em_stderr": 0.004000599568072892,
"f1": 0.23667890100671124,
"f1_stderr": 0.003992615682814011
},
"harness|gsm8k|5": {
"acc": 0.06444275966641395,
"acc_stderr": 0.006763391728488265
}
}
```
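The per-task scores above lend themselves to quick aggregation. Below is a minimal, self-contained sketch (our own illustration, not part of the evaluation harness) that parses a small excerpt of the results JSON and macro-averages the MMLU (`hendrycksTest`) accuracies:

```python
import json
from statistics import mean

# A small excerpt of the results JSON shown above.
results_json = """
{
  "harness|hendrycksTest-virology|5": {"acc": 0.5180722891566265},
  "harness|hendrycksTest-world_religions|5": {"acc": 0.7953216374269005},
  "harness|gsm8k|5": {"acc": 0.06444275966641395}
}
"""

results = json.loads(results_json)

# Macro-average accuracy over the MMLU (hendrycksTest) subtasks only.
mmlu_accs = [
    v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = mean(mmlu_accs)
print(f"MMLU macro-average over {len(mmlu_accs)} subtasks: {mmlu_avg:.4f}")
```

The same pattern scales to the full results file: load it with `json.load`, filter keys by benchmark prefix, and average.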
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
CVasNLPExperiments/cv-as-nlp-vqa-example | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: train
num_bytes: 585201.0
num_examples: 10
- name: test
num_bytes: 585201.0
num_examples: 10
download_size: 1175242
dataset_size: 1170402.0
---
# Dataset Card for "cv-as-nlp-vqa-example"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BlackSamorez/2ch_b_dialogues | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- ru
license: []
multilinguality:
- monolingual
pretty_name: Dialogues mined from 2ch/b/.
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- conversational
task_ids:
- dialogue-generation
---
# Dataset Card for 2ch_b_dialogues
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://github.com/BlackSamorez/ebanko
- **Repository:** [Needs More Information]
- **Paper:** [Needs More Information]
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [Needs More Information]
### Dataset Summary
Russian-language dialogues mined from 2ch.hk/b/
### Supported Tasks and Leaderboards
[Needs More Information]
### Languages
Russian
## Dataset Structure
### Data Instances
{
"dialogue": ["Glad to hear!", "Fine, thank you!", "Hi, how are you?"]
}
### Data Fields
- `dialogue`: list of posts, ordered last-to-first (the most recent post comes first)
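Because posts are stored last-to-first, most dialogue-modeling pipelines will want to reverse each list into chronological order. A minimal sketch using the instance shown above (the reversal step is our own suggestion, not part of the dataset):

```python
# A data instance as shown above: posts ordered last-to-first.
instance = {
    "dialogue": ["Glad to hear!", "Fine, thank you!", "Hi, how are you?"]
}

# Reverse to chronological (first-to-last) order for dialogue modeling.
chronological = list(reversed(instance["dialogue"]))
print(chronological)
# → ['Hi, how are you?', 'Fine, thank you!', 'Glad to hear!']
```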
### Data Splits
[Needs More Information]
## Dataset Creation
### Curation Rationale
Fun
### Source Data
#### Initial Data Collection and Normalization
In the thread graph, only vertices with a single parent were selected; non-overlapping chains of dialogue were then built from them.
#### Who are the source language producers?
2ch.hk/b/ users
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
Morally questionable data
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
blacks_samorez
### Licensing Information
[Needs More Information]
### Citation Information
[Needs More Information] |
hoangphu7122002ai/t2sql_en_v1 | ---
dataset_info:
features:
- name: question
dtype: string
- name: context
dtype: string
- name: answer
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 289724169
num_examples: 340785
download_size: 81198356
dataset_size: 289724169
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ura-hcmut/mlqa_MLM-dpo | ---
license: mit
language:
- vi
size_categories:
- 1K<n<10K
configs:
- config_name: default
data_files:
- split: test
path: mlqa_MLM-dpo.json
--- |
nguyenminh871/java_unifiedbug_2_1 | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: func
dtype: string
- name: target
dtype:
class_label:
names:
'0': true
'1': false
- name: project
dtype: string
splits:
- name: train
num_bytes: 10257684.819539886
num_examples: 3349
- name: test
num_bytes: 3421270.213026591
num_examples: 1117
- name: validation
num_bytes: 3421270.213026591
num_examples: 1117
download_size: 7298469
dataset_size: 17100225.245593067
---
# Dataset Card for "java_unifiedbug_2_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Francesco/people-in-paintings | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
sequence:
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: category
dtype:
class_label:
names:
'0': people-in-paintings
'1': Human
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- object-detection
task_ids: []
pretty_name: people-in-paintings
tags:
- rf100
---
# Dataset Card for people-in-paintings
**The original COCO dataset is stored at `dataset.tar.gz`.**
## Dataset Description
- **Homepage:** https://universe.roboflow.com/object-detection/people-in-paintings
- **Point of Contact:** francesco.zuppichini@gmail.com
### Dataset Summary
people-in-paintings
### Supported Tasks and Leaderboards
- `object-detection`: The dataset can be used to train a model for Object Detection.
### Languages
English
## Dataset Structure
### Data Instances
A data point comprises an image and its object annotations.
```
{
'image_id': 15,
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=640x640 at 0x2373B065C18>,
        'width': 640,
'height': 640,
'objects': {
'id': [114, 115, 116, 117],
'area': [3796, 1596, 152768, 81002],
'bbox': [
[302.0, 109.0, 73.0, 52.0],
[810.0, 100.0, 57.0, 28.0],
[160.0, 31.0, 248.0, 616.0],
[741.0, 68.0, 202.0, 401.0]
],
'category': [4, 4, 0, 0]
}
}
```
### Data Fields
- `image_id`: the image id
- `image`: `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`
- `width`: the image width
- `height`: the image height
- `objects`: a dictionary containing bounding box metadata for the objects present on the image
- `id`: the annotation id
- `area`: the area of the bounding box
- `bbox`: the object's bounding box (in the [coco](https://albumentations.ai/docs/getting_started/bounding_boxes_augmentation/#coco) format)
- `category`: the object's category.
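Since `bbox` uses the COCO `[x_min, y_min, width, height]` convention, converting to corner coordinates is a one-liner. A minimal sketch (the helper name is ours) applied to one of the boxes from the instance above:

```python
def coco_to_corners(bbox):
    """Convert a COCO-format box [x_min, y_min, width, height]
    to corner format [x_min, y_min, x_max, y_max]."""
    x, y, w, h = bbox
    return [x, y, x + w, y + h]

# One of the boxes from the data instance above.
print(coco_to_corners([302.0, 109.0, 73.0, 52.0]))
# → [302.0, 109.0, 375.0, 161.0]
```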
#### Who are the annotators?
Annotators are Roboflow users
## Additional Information
### Licensing Information
See original homepage https://universe.roboflow.com/object-detection/people-in-paintings
### Citation Information
```
@misc{ people-in-paintings,
title = { people in paintings Dataset },
type = { Open Source Dataset },
author = { Roboflow 100 },
howpublished = { \url{ https://universe.roboflow.com/object-detection/people-in-paintings } },
url = { https://universe.roboflow.com/object-detection/people-in-paintings },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { nov },
note = { visited on 2023-03-29 },
}
```
### Contributions
Thanks to [@mariosasko](https://github.com/mariosasko) for adding this dataset. |
Danielouo/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4307372
num_examples: 1000
download_size: 2283061
dataset_size: 4307372
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MohammedAlsayani/FalconDataset | ---
task_categories:
- text-generation
language:
- ar
- en
pretty_name: flacon
--- |
CyberHarem/yuuki_haru_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yuuki_haru/結城晴 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of yuuki_haru/結城晴 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags.
The core tags of this character are `orange_hair, long_hair, bangs, purple_eyes, brown_eyes, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 543.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yuuki_haru_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 338.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yuuki_haru_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1187 | 731.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yuuki_haru_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 496.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yuuki_haru_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1187 | 997.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yuuki_haru_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/yuuki_haru_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, blush, navel, open_mouth, small_breasts, looking_at_viewer, side-tie_bikini_bottom, solo_focus, 1boy, hetero, micro_bikini, nipples, penis |
| 1 | 15 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blush, smile, puffy_short_sleeves, dress, mini_hat, open_mouth, ponytail, simple_background, white_background, bow, wrist_cuffs, frills, apron, shirt, top_hat |
| 2 | 26 |  |  |  |  |  | 1girl, cheerleader, midriff, solo, twintails, blush, hair_ribbon, navel, pleated_skirt, pom_pom_(cheerleading), looking_at_viewer, crop_top, simple_background, white_background, miniskirt, red_skirt, armpits, bike_shorts, holding, open_mouth, breasts, collarbone, shorts_under_skirt, sleeveless_shirt, arm_up |
| 3 | 5 |  |  |  |  |  | 1girl, simple_background, solo, upper_body, white_background, blush, open_mouth, short_sleeves, blue_shirt, brown_hair, looking_at_viewer, signature, sweat, upper_teeth_only |
| 4 | 5 |  |  |  |  |  | 1girl, hoodie, simple_background, solo, upper_body, white_background, looking_at_viewer, blush, closed_mouth, open_mouth |
| 5 | 34 |  |  |  |  |  | 1girl, tomboy, solo, baseball_cap, looking_at_viewer, backwards_hat, simple_background, sneakers, hair_through_headwear, blush, hoodie, white_background, white_shirt, black_shorts, smile, soccer_ball, full_body, open_jacket, socks |
| 6 | 8 |  |  |  |  |  | 1girl, enmaided, maid_apron, maid_headdress, puffy_short_sleeves, solo, blush, frilled_apron, looking_at_viewer, white_apron, wrist_cuffs, neck_ribbon, simple_background, smile, white_thighhighs, black_dress, blue_dress, hair_ribbon, red_ribbon, blue_bow, frilled_dress, holding, medium_hair, sweatdrop, vertical-striped_dress, white_background |
| 7 | 7 |  |  |  |  |  | 1girl, blush, looking_at_viewer, midriff, navel, solo, open_mouth, smile, star_(symbol), bare_shoulders, choker, gloves, hat, necklace, pink_jacket, skirt |
| 8 | 10 |  |  |  |  |  | 1girl, midriff, solo, black_gloves, fingerless_gloves, looking_at_viewer, navel, open_mouth, white_shorts, headphones, short_shorts, smile, blush, crop_top, headset, shoes, socks, suspender_shorts, belt, blue_footwear, one_eye_closed, choker, full_body, hood_down, hooded_jacket, open_jacket, star_(symbol), ;d, blue_jacket, buckle, collarbone, hairband, medium_hair, sleeveless_jacket |
| 9 | 13 |  |  |  |  |  | 1girl, fake_animal_ears, rabbit_ears, blush, looking_at_viewer, simple_background, playboy_bunny, solo, small_breasts, white_background, bare_shoulders, bowtie, wrist_cuffs, black_leotard, detached_collar, open_mouth, rabbit_tail, pantyhose, brown_hair, full_body, high_heels, strapless_leotard |
| 10 | 5 |  |  |  |  |  | 1girl, navel, solo, blush, small_breasts, bare_shoulders, collarbone, cowboy_shot, panties, shorts, simple_background, sports_bra, white_background, bare_arms, closed_mouth, looking_at_viewer, stomach, underwear_only |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | navel | open_mouth | small_breasts | looking_at_viewer | side-tie_bikini_bottom | solo_focus | 1boy | hetero | micro_bikini | nipples | penis | solo | smile | puffy_short_sleeves | dress | mini_hat | ponytail | simple_background | white_background | bow | wrist_cuffs | frills | apron | shirt | top_hat | cheerleader | midriff | twintails | hair_ribbon | pleated_skirt | pom_pom_(cheerleading) | crop_top | miniskirt | red_skirt | armpits | bike_shorts | holding | breasts | collarbone | shorts_under_skirt | sleeveless_shirt | arm_up | upper_body | short_sleeves | blue_shirt | brown_hair | signature | sweat | upper_teeth_only | hoodie | closed_mouth | tomboy | baseball_cap | backwards_hat | sneakers | hair_through_headwear | white_shirt | black_shorts | soccer_ball | full_body | open_jacket | socks | enmaided | maid_apron | maid_headdress | frilled_apron | white_apron | neck_ribbon | white_thighhighs | black_dress | blue_dress | red_ribbon | blue_bow | frilled_dress | medium_hair | sweatdrop | vertical-striped_dress | star_(symbol) | bare_shoulders | choker | gloves | hat | necklace | pink_jacket | skirt | black_gloves | fingerless_gloves | white_shorts | headphones | short_shorts | headset | shoes | suspender_shorts | belt | blue_footwear | one_eye_closed | hood_down | hooded_jacket | ;d | blue_jacket | buckle | hairband | sleeveless_jacket | fake_animal_ears | rabbit_ears | playboy_bunny | bowtie | black_leotard | detached_collar | rabbit_tail | pantyhose | high_heels | strapless_leotard | cowboy_shot | panties | shorts | sports_bra | bare_arms | stomach | underwear_only |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------|:--------|:-------------|:----------------|:--------------------|:-------------------------|:-------------|:-------|:---------|:---------------|:----------|:--------|:-------|:--------|:----------------------|:--------|:-----------|:-----------|:--------------------|:-------------------|:------|:--------------|:---------|:--------|:--------|:----------|:--------------|:----------|:------------|:--------------|:----------------|:-------------------------|:-----------|:------------|:------------|:----------|:--------------|:----------|:----------|:-------------|:---------------------|:-------------------|:---------|:-------------|:----------------|:-------------|:-------------|:------------|:--------|:-------------------|:---------|:---------------|:---------|:---------------|:----------------|:-----------|:------------------------|:--------------|:---------------|:--------------|:------------|:--------------|:--------|:-----------|:-------------|:-----------------|:----------------|:--------------|:--------------|:-------------------|:--------------|:-------------|:-------------|:-----------|:----------------|:--------------|:------------|:-------------------------|:----------------|:-----------------|:---------|:---------|:------|:-----------|:--------------|:--------|:---------------|:--------------------|:---------------|:-------------|:---------------|:----------|:--------|:-------------------|:-------|:----------------|:-----------------|:------------|:----------------|:-----|:--------------|:---------|:-----------|:--------------------|:-------------------|:--------------|:----------------|:---------|:----------------|:------------------|:--------------|:------------|:-------------|:--------------------|:--------------|:----------|:---------|:-------------|:------------|:----------|:-----------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 15 |  |  |  |  |  | X | X | | X | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 26 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | X | | | | | | X | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | | X | | X | | | | | | | | X | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | | X | | X | | | | | | | | X | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 34 |  |  |  |  |  | X | X | | | | X | | | | | | | | X | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 8 |  |  |  |  |  | X | X | | | | X | | | | | | | | X | X | X | | | | X | X | | X | | | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 7 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | X | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 10 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | X | X | | | | | | | | | | | | | | X | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | X | | | X | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 9 | 13 |  |  |  |  |  | X | X | | X | X | X | | | | | | | | X | | | | | | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 10 | 5 |  |  |  |  |  | X | X | X | | X | X | | | | | | | | X | | | | | | X | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X |
|
SkillstechAI/dataset-LMO-chatgpt | ---
license: apache-2.0
---
|
Captluke/llama2-wiki-v3 | ---
language:
- en
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
AllyAkin/LuedjiLunavocals | ---
license: openrail
---
|
open-llm-leaderboard/details_dddsaty__SOLAR_Merge_Adapter_DPO_Orca | ---
pretty_name: Evaluation run of dddsaty/SOLAR_Merge_Adapter_DPO_Orca
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dddsaty/SOLAR_Merge_Adapter_DPO_Orca](https://huggingface.co/dddsaty/SOLAR_Merge_Adapter_DPO_Orca)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dddsaty__SOLAR_Merge_Adapter_DPO_Orca\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-05T08:48:15.938281](https://huggingface.co/datasets/open-llm-leaderboard/details_dddsaty__SOLAR_Merge_Adapter_DPO_Orca/blob/main/results_2024-02-05T08-48-15.938281.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6327439375185642,\n\
\ \"acc_stderr\": 0.032443250608216324,\n \"acc_norm\": 0.6355200172658205,\n\
\ \"acc_norm_stderr\": 0.03310341022715256,\n \"mc1\": 0.36107711138310894,\n\
\ \"mc1_stderr\": 0.016814312844836882,\n \"mc2\": 0.51488245253393,\n\
\ \"mc2_stderr\": 0.015188854393420268\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6109215017064846,\n \"acc_stderr\": 0.014247309976045607,\n\
\ \"acc_norm\": 0.6390784982935154,\n \"acc_norm_stderr\": 0.014034761386175452\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6499701254730134,\n\
\ \"acc_stderr\": 0.004760041843651493,\n \"acc_norm\": 0.8458474407488548,\n\
\ \"acc_norm_stderr\": 0.0036035695286784114\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.02906722014664483,\n\
\ \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.02906722014664483\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.03745554791462456,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.03745554791462456\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099522,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099522\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04082482904638628,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04082482904638628\n },\n\
\ \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n\
\ \"acc_stderr\": 0.025331202438944423,\n \"acc_norm\": 0.41005291005291006,\n\
\ \"acc_norm_stderr\": 0.025331202438944423\n },\n \"harness|hendrycksTest-formal_logic|5\"\
: {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n\
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n\
\ },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.7322580645161291,\n \"acc_stderr\": 0.02518900666021238,\n\
\ \"acc_norm\": 0.7322580645161291,\n \"acc_norm_stderr\": 0.02518900666021238\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.43349753694581283,\n \"acc_stderr\": 0.03486731727419872,\n \"\
acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.03486731727419872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8282828282828283,\n \"acc_stderr\": 0.0268697161874299,\n \"acc_norm\"\
: 0.8282828282828283,\n \"acc_norm_stderr\": 0.0268697161874299\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593566,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593566\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.02475600038213095,\n \
\ \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.02475600038213095\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3888888888888889,\n \"acc_stderr\": 0.029723278961476664,\n \
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.029723278961476664\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135353,\n\
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135353\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659809,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659809\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8165137614678899,\n \"acc_stderr\": 0.016595259710399293,\n \"\
acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.016595259710399293\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5648148148148148,\n \"acc_stderr\": 0.033812000056435254,\n \"\
acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.033812000056435254\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8725490196078431,\n \"acc_stderr\": 0.02340553048084631,\n \"\
acc_norm\": 0.8725490196078431,\n \"acc_norm_stderr\": 0.02340553048084631\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n\
\ \"acc_stderr\": 0.030500283176545847,\n \"acc_norm\": 0.7085201793721974,\n\
\ \"acc_norm_stderr\": 0.030500283176545847\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281372,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281372\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579828,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579828\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.025009313790069716,\n\
\ \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.025009313790069716\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35195530726256985,\n\
\ \"acc_stderr\": 0.01597266852368907,\n \"acc_norm\": 0.35195530726256985,\n\
\ \"acc_norm_stderr\": 0.01597266852368907\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.025917806117147158,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.025917806117147158\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.02456922360046085,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.02456922360046085\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829714,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829714\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n\
\ \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n\
\ \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.02928941340940319,\n\
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.02928941340940319\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6437908496732027,\n \"acc_stderr\": 0.0193733324207245,\n \
\ \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.0193733324207245\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36107711138310894,\n\
\ \"mc1_stderr\": 0.016814312844836882,\n \"mc2\": 0.51488245253393,\n\
\ \"mc2_stderr\": 0.015188854393420268\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8200473559589582,\n \"acc_stderr\": 0.01079646868806868\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5056861258529188,\n \
\ \"acc_stderr\": 0.013771594106283036\n }\n}\n```"
repo_url: https://huggingface.co/dddsaty/SOLAR_Merge_Adapter_DPO_Orca
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|arc:challenge|25_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|gsm8k|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hellaswag|10_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-05T08-48-15.938281.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-05T08-48-15.938281.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- '**/details_harness|winogrande|5_2024-02-05T08-48-15.938281.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-05T08-48-15.938281.parquet'
- config_name: results
data_files:
- split: 2024_02_05T08_48_15.938281
path:
- results_2024-02-05T08-48-15.938281.parquet
- split: latest
path:
- results_2024-02-05T08-48-15.938281.parquet
---
# Dataset Card for Evaluation run of dddsaty/SOLAR_Merge_Adapter_DPO_Orca
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [dddsaty/SOLAR_Merge_Adapter_DPO_Orca](https://huggingface.co/dddsaty/SOLAR_Merge_Adapter_DPO_Orca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dddsaty__SOLAR_Merge_Adapter_DPO_Orca",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-05T08:48:15.938281](https://huggingface.co/datasets/open-llm-leaderboard/details_dddsaty__SOLAR_Merge_Adapter_DPO_Orca/blob/main/results_2024-02-05T08-48-15.938281.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6327439375185642,
"acc_stderr": 0.032443250608216324,
"acc_norm": 0.6355200172658205,
"acc_norm_stderr": 0.03310341022715256,
"mc1": 0.36107711138310894,
"mc1_stderr": 0.016814312844836882,
"mc2": 0.51488245253393,
"mc2_stderr": 0.015188854393420268
},
"harness|arc:challenge|25": {
"acc": 0.6109215017064846,
"acc_stderr": 0.014247309976045607,
"acc_norm": 0.6390784982935154,
"acc_norm_stderr": 0.014034761386175452
},
"harness|hellaswag|10": {
"acc": 0.6499701254730134,
"acc_stderr": 0.004760041843651493,
"acc_norm": 0.8458474407488548,
"acc_norm_stderr": 0.0036035695286784114
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.02906722014664483,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.02906722014664483
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03745554791462456,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03745554791462456
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099522,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099522
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.04082482904638628,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04082482904638628
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944423,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944423
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7322580645161291,
"acc_stderr": 0.02518900666021238,
"acc_norm": 0.7322580645161291,
"acc_norm_stderr": 0.02518900666021238
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43349753694581283,
"acc_stderr": 0.03486731727419872,
"acc_norm": 0.43349753694581283,
"acc_norm_stderr": 0.03486731727419872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8282828282828283,
"acc_stderr": 0.0268697161874299,
"acc_norm": 0.8282828282828283,
"acc_norm_stderr": 0.0268697161874299
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593566,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593566
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.02475600038213095,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.02475600038213095
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.029723278961476664,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.029723278961476664
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135353,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135353
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659809,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659809
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.016595259710399293,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.016595259710399293
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.033812000056435254,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.033812000056435254
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8725490196078431,
"acc_stderr": 0.02340553048084631,
"acc_norm": 0.8725490196078431,
"acc_norm_stderr": 0.02340553048084631
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.030500283176545847,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.030500283176545847
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281372,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579828,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579828
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.025009313790069716,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.025009313790069716
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35195530726256985,
"acc_stderr": 0.01597266852368907,
"acc_norm": 0.35195530726256985,
"acc_norm_stderr": 0.01597266852368907
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.025917806117147158,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.025917806117147158
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.02456922360046085,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.02456922360046085
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829714,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829714
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869647,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.02928941340940319,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.02928941340940319
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6437908496732027,
"acc_stderr": 0.0193733324207245,
"acc_norm": 0.6437908496732027,
"acc_norm_stderr": 0.0193733324207245
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36107711138310894,
"mc1_stderr": 0.016814312844836882,
"mc2": 0.51488245253393,
"mc2_stderr": 0.015188854393420268
},
"harness|winogrande|5": {
"acc": 0.8200473559589582,
"acc_stderr": 0.01079646868806868
},
"harness|gsm8k|5": {
"acc": 0.5056861258529188,
"acc_stderr": 0.013771594106283036
}
}
```
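As an illustration (this helper is hypothetical, not part of the leaderboard tooling), the headline `"all"` accuracy appears to be the macro-average of the per-task `acc` values in a results dictionary shaped like the JSON above; it could be recomputed with a short sketch such as:

```python
# Hypothetical helper: recompute the macro-averaged accuracy from a
# results dict shaped like the JSON above (keys are task names, values
# are metric dicts). The "all" entry is excluded, since it holds the
# aggregate itself, as are entries without an "acc" metric.
def macro_avg_acc(results: dict) -> float:
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task != "all" and "acc" in metrics
    ]
    return sum(accs) / len(accs)


# Tiny made-up example with two tasks:
sample = {
    "all": {"acc": 0.55},
    "harness|arc:challenge|25": {"acc": 0.6},
    "harness|hellaswag|10": {"acc": 0.5},
}
print(macro_avg_acc(sample))  # -> 0.55
```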
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
reprography/output | ---
license: openrail++
---
|
yjernite/prof_images_blip__dalle-2 | ---
dataset_info:
features:
- name: images
dtype: image
- name: embeddings
sequence: float32
splits:
- name: paralegal
num_bytes: 6171541.0
num_examples: 210
- name: bartender
num_bytes: 7833899.0
num_examples: 210
- name: facilities_manager
num_bytes: 5788116.0
num_examples: 210
- name: accountant
num_bytes: 6026020.0
num_examples: 210
- name: graphic_designer
num_bytes: 6387627.0
num_examples: 210
- name: network_administrator
num_bytes: 7762970.0
num_examples: 210
- name: financial_manager
num_bytes: 5530632.0
num_examples: 210
- name: baker
num_bytes: 6316094.0
num_examples: 210
- name: security_guard
num_bytes: 5795899.0
num_examples: 210
- name: artist
num_bytes: 6803991.0
num_examples: 210
- name: author
num_bytes: 6915505.0
num_examples: 210
- name: printing_press_operator
num_bytes: 8473037.0
num_examples: 210
- name: public_relations_specialist
num_bytes: 5380182.0
num_examples: 210
- name: sheet_metal_worker
num_bytes: 7975210.0
num_examples: 210
- name: clergy
num_bytes: 5223476.0
num_examples: 210
- name: payroll_clerk
num_bytes: 6773901.0
num_examples: 210
- name: teller
num_bytes: 6267512.0
num_examples: 210
- name: real_estate_broker
num_bytes: 6181930.0
num_examples: 210
- name: customer_service_representative
num_bytes: 6255124.0
num_examples: 210
- name: painter
num_bytes: 7059566.0
num_examples: 210
- name: tractor_operator
num_bytes: 8205340.0
num_examples: 210
- name: dental_hygienist
num_bytes: 6308110.0
num_examples: 210
- name: industrial_engineer
num_bytes: 6314334.0
num_examples: 210
- name: electrician
num_bytes: 6819717.0
num_examples: 210
- name: head_cook
num_bytes: 5257860.0
num_examples: 210
- name: health_technician
num_bytes: 5593594.0
num_examples: 210
- name: carpet_installer
num_bytes: 7547954.0
num_examples: 210
- name: purchasing_agent
num_bytes: 5512896.0
num_examples: 210
- name: supervisor
num_bytes: 5654546.0
num_examples: 210
- name: civil_engineer
num_bytes: 5988844.0
num_examples: 210
- name: lawyer
num_bytes: 5621864.0
num_examples: 210
- name: language_pathologist
num_bytes: 6758300.0
num_examples: 210
- name: ceo
num_bytes: 5519589.0
num_examples: 210
- name: computer_support_specialist
num_bytes: 5938747.0
num_examples: 210
- name: postal_worker
num_bytes: 6050033.0
num_examples: 210
- name: mechanical_engineer
num_bytes: 6881390.0
num_examples: 210
- name: nursing_assistant
num_bytes: 5394085.0
num_examples: 210
- name: dentist
num_bytes: 5936658.0
num_examples: 210
- name: tutor
num_bytes: 6677088.0
num_examples: 210
- name: butcher
num_bytes: 6984230.0
num_examples: 210
- name: insurance_agent
num_bytes: 5639693.0
num_examples: 210
- name: courier
num_bytes: 5364578.0
num_examples: 210
- name: computer_programmer
num_bytes: 6987489.0
num_examples: 210
- name: truck_driver
num_bytes: 7359790.0
num_examples: 210
- name: mechanic
num_bytes: 7121417.0
num_examples: 210
- name: marketing_manager
num_bytes: 5507124.0
num_examples: 210
- name: sales_manager
num_bytes: 5342664.0
num_examples: 210
- name: correctional_officer
num_bytes: 5573956.0
num_examples: 210
- name: manager
num_bytes: 5396427.0
num_examples: 210
- name: underwriter
num_bytes: 6040312.0
num_examples: 210
- name: executive_assistant
num_bytes: 5352534.0
num_examples: 210
- name: designer
num_bytes: 6566770.0
num_examples: 210
- name: groundskeeper
num_bytes: 8084303.0
num_examples: 210
- name: mental_health_counselor
num_bytes: 6916972.0
num_examples: 210
- name: aerospace_engineer
num_bytes: 6479161.0
num_examples: 210
- name: taxi_driver
num_bytes: 7017648.0
num_examples: 210
- name: nurse
num_bytes: 5495626.0
num_examples: 210
- name: data_entry_keyer
num_bytes: 6795816.0
num_examples: 210
- name: musician
num_bytes: 6697000.0
num_examples: 210
- name: event_planner
num_bytes: 6572383.0
num_examples: 210
- name: writer
num_bytes: 7314011.0
num_examples: 210
- name: cook
num_bytes: 5418827.0
num_examples: 210
- name: welder
num_bytes: 7087349.0
num_examples: 210
- name: producer
num_bytes: 7145632.0
num_examples: 210
- name: hairdresser
num_bytes: 6623867.0
num_examples: 210
- name: farmer
num_bytes: 8185317.0
num_examples: 210
- name: construction_worker
num_bytes: 6468388.0
num_examples: 210
- name: air_conditioning_installer
num_bytes: 7370683.0
num_examples: 210
- name: electrical_engineer
num_bytes: 6715574.0
num_examples: 210
- name: occupational_therapist
num_bytes: 5604385.0
num_examples: 210
- name: career_counselor
num_bytes: 5745900.0
num_examples: 210
- name: interior_designer
num_bytes: 7494072.0
num_examples: 210
- name: jailer
num_bytes: 7390180.0
num_examples: 210
- name: office_clerk
num_bytes: 6158754.0
num_examples: 210
- name: market_research_analyst
num_bytes: 6426759.0
num_examples: 210
- name: laboratory_technician
num_bytes: 6085724.0
num_examples: 210
- name: social_assistant
num_bytes: 5740255.0
num_examples: 210
- name: medical_records_specialist
num_bytes: 5676198.0
num_examples: 210
- name: machinery_mechanic
num_bytes: 7906020.0
num_examples: 210
- name: police_officer
num_bytes: 5324492.0
num_examples: 210
- name: software_developer
num_bytes: 6276609.0
num_examples: 210
- name: clerk
num_bytes: 5676407.0
num_examples: 210
- name: salesperson
num_bytes: 5914770.0
num_examples: 210
- name: social_worker
num_bytes: 6347119.0
num_examples: 210
- name: director
num_bytes: 6290656.0
num_examples: 210
- name: fast_food_worker
num_bytes: 6558433.0
num_examples: 210
- name: singer
num_bytes: 6636106.0
num_examples: 210
- name: metal_worker
num_bytes: 7731701.0
num_examples: 210
- name: cleaner
num_bytes: 5941280.0
num_examples: 210
- name: computer_systems_analyst
num_bytes: 7377078.0
num_examples: 210
- name: dental_assistant
num_bytes: 5946076.0
num_examples: 210
- name: psychologist
num_bytes: 6546652.0
num_examples: 210
- name: machinist
num_bytes: 7787841.0
num_examples: 210
- name: therapist
num_bytes: 5711184.0
num_examples: 210
- name: veterinarian
num_bytes: 5784277.0
num_examples: 210
- name: teacher
num_bytes: 6482004.0
num_examples: 210
- name: architect
num_bytes: 5692410.0
num_examples: 210
- name: office_worker
num_bytes: 5996521.0
num_examples: 210
- name: drywall_installer
num_bytes: 6688982.0
num_examples: 210
- name: nutritionist
num_bytes: 6426587.0
num_examples: 210
- name: librarian
num_bytes: 8678483.0
num_examples: 210
- name: childcare_worker
num_bytes: 6822808.0
num_examples: 210
- name: school_bus_driver
num_bytes: 8018152.0
num_examples: 210
- name: file_clerk
num_bytes: 6955599.0
num_examples: 210
- name: logistician
num_bytes: 5926532.0
num_examples: 210
- name: scientist
num_bytes: 5943205.0
num_examples: 210
- name: teaching_assistant
num_bytes: 5865061.0
num_examples: 210
- name: radiologic_technician
num_bytes: 6388216.0
num_examples: 210
- name: manicurist
num_bytes: 6873651.0
num_examples: 210
- name: community_manager
num_bytes: 6385984.0
num_examples: 210
- name: carpenter
num_bytes: 7638421.0
num_examples: 210
- name: claims_appraiser
num_bytes: 6312794.0
num_examples: 210
- name: dispatcher
num_bytes: 6274437.0
num_examples: 210
- name: cashier
num_bytes: 7570675.0
num_examples: 210
- name: roofer
num_bytes: 7478357.0
num_examples: 210
- name: photographer
num_bytes: 6540455.0
num_examples: 210
- name: detective
num_bytes: 5955413.0
num_examples: 210
- name: financial_advisor
num_bytes: 5829031.0
num_examples: 210
- name: wholesale_buyer
num_bytes: 7808137.0
num_examples: 210
- name: it_specialist
num_bytes: 5992213.0
num_examples: 210
- name: pharmacy_technician
num_bytes: 7059008.0
num_examples: 210
- name: engineer
num_bytes: 5572540.0
num_examples: 210
- name: mover
num_bytes: 5590911.0
num_examples: 210
- name: plane_mechanic
num_bytes: 7906358.0
num_examples: 210
- name: interviewer
num_bytes: 5934702.0
num_examples: 210
- name: massage_therapist
num_bytes: 6023245.0
num_examples: 210
- name: dishwasher
num_bytes: 8125724.0
num_examples: 210
- name: fitness_instructor
num_bytes: 4981418.0
num_examples: 210
- name: credit_counselor
num_bytes: 5738880.0
num_examples: 210
- name: stocker
num_bytes: 6666621.0
num_examples: 210
- name: pharmacist
num_bytes: 6738866.0
num_examples: 210
- name: doctor
num_bytes: 5611600.0
num_examples: 210
- name: compliance_officer
num_bytes: 5511904.0
num_examples: 210
- name: aide
num_bytes: 5725510.0
num_examples: 210
- name: bus_driver
num_bytes: 7206303.0
num_examples: 210
- name: financial_analyst
num_bytes: 6012222.0
num_examples: 210
- name: receptionist
num_bytes: 6144590.0
num_examples: 210
- name: janitor
num_bytes: 6307371.0
num_examples: 210
- name: plumber
num_bytes: 6220333.0
num_examples: 210
- name: physical_therapist
num_bytes: 5152228.0
num_examples: 210
- name: inventory_clerk
num_bytes: 7740513.0
num_examples: 210
- name: firefighter
num_bytes: 6957703.0
num_examples: 210
- name: coach
num_bytes: 5582613.0
num_examples: 210
- name: maid
num_bytes: 5835512.0
num_examples: 210
- name: pilot
num_bytes: 6510679.0
num_examples: 210
- name: repair_worker
num_bytes: 6293370.0
num_examples: 210
download_size: 992148797
dataset_size: 940302502.0
---
# Dataset Card for "prof_images_blip__dalle-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_xriminact__TarsChattyBasev0.2 | ---
pretty_name: Evaluation run of xriminact/TarsChattyBasev0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xriminact/TarsChattyBasev0.2](https://huggingface.co/xriminact/TarsChattyBasev0.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xriminact__TarsChattyBasev0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-16T14:41:25.883812](https://huggingface.co/datasets/open-llm-leaderboard/details_xriminact__TarsChattyBasev0.2/blob/main/results_2024-01-16T14-41-25.883812.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.48620348001542985,\n\
\ \"acc_stderr\": 0.0348281020538984,\n \"acc_norm\": 0.48570430699642225,\n\
\ \"acc_norm_stderr\": 0.03554381019405187,\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184824,\n \"mc2\": 0.4378655189377286,\n\
\ \"mc2_stderr\": 0.01517288496510812\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4761092150170648,\n \"acc_stderr\": 0.014594701798071654,\n\
\ \"acc_norm\": 0.5221843003412969,\n \"acc_norm_stderr\": 0.014597001927076135\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5952001593308106,\n\
\ \"acc_stderr\": 0.004898501014225837,\n \"acc_norm\": 0.7778331009759012,\n\
\ \"acc_norm_stderr\": 0.004148531608981493\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.46710526315789475,\n \"acc_stderr\": 0.040601270352363966,\n\
\ \"acc_norm\": 0.46710526315789475,\n \"acc_norm_stderr\": 0.040601270352363966\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5169811320754717,\n \"acc_stderr\": 0.030755120364119905,\n\
\ \"acc_norm\": 0.5169811320754717,\n \"acc_norm_stderr\": 0.030755120364119905\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4791666666666667,\n\
\ \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 0.4791666666666667,\n\
\ \"acc_norm_stderr\": 0.041775789507399935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \"acc_norm\": 0.35,\n\
\ \"acc_norm_stderr\": 0.04793724854411018\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.44508670520231214,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.44508670520231214,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.63,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35714285714285715,\n \"acc_stderr\": 0.02467786284133278,\n \"\
acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.02467786284133278\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743743,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743743\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5709677419354838,\n \"acc_stderr\": 0.028156036538233193,\n \"\
acc_norm\": 0.5709677419354838,\n \"acc_norm_stderr\": 0.028156036538233193\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.39408866995073893,\n \"acc_stderr\": 0.034381579670365446,\n \"\
acc_norm\": 0.39408866995073893,\n \"acc_norm_stderr\": 0.034381579670365446\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.503030303030303,\n \"acc_stderr\": 0.039042723414318574,\n\
\ \"acc_norm\": 0.503030303030303,\n \"acc_norm_stderr\": 0.039042723414318574\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6111111111111112,\n \"acc_stderr\": 0.0347327959083696,\n \"acc_norm\"\
: 0.6111111111111112,\n \"acc_norm_stderr\": 0.0347327959083696\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.6528497409326425,\n \"acc_stderr\": 0.03435696168361355,\n\
\ \"acc_norm\": 0.6528497409326425,\n \"acc_norm_stderr\": 0.03435696168361355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.48205128205128206,\n \"acc_stderr\": 0.02533466708095495,\n\
\ \"acc_norm\": 0.48205128205128206,\n \"acc_norm_stderr\": 0.02533466708095495\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524572,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524572\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4831932773109244,\n \"acc_stderr\": 0.03246013680375308,\n \
\ \"acc_norm\": 0.4831932773109244,\n \"acc_norm_stderr\": 0.03246013680375308\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.037579499229433426,\n \"\
acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.037579499229433426\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6091743119266055,\n \"acc_stderr\": 0.020920058346111055,\n \"\
acc_norm\": 0.6091743119266055,\n \"acc_norm_stderr\": 0.020920058346111055\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.36574074074074076,\n \"acc_stderr\": 0.03284738857647207,\n \"\
acc_norm\": 0.36574074074074076,\n \"acc_norm_stderr\": 0.03284738857647207\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5784313725490197,\n \"acc_stderr\": 0.034658681963807614,\n \"\
acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.034658681963807614\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5907172995780591,\n \"acc_stderr\": 0.032007041833595914,\n \
\ \"acc_norm\": 0.5907172995780591,\n \"acc_norm_stderr\": 0.032007041833595914\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4977578475336323,\n\
\ \"acc_stderr\": 0.033557465352232634,\n \"acc_norm\": 0.4977578475336323,\n\
\ \"acc_norm_stderr\": 0.033557465352232634\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.043171711948702556,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.043171711948702556\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968431,\n \"\
acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968431\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.46296296296296297,\n\
\ \"acc_stderr\": 0.04820403072760627,\n \"acc_norm\": 0.46296296296296297,\n\
\ \"acc_norm_stderr\": 0.04820403072760627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6073619631901841,\n \"acc_stderr\": 0.03836740907831029,\n\
\ \"acc_norm\": 0.6073619631901841,\n \"acc_norm_stderr\": 0.03836740907831029\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.045821241601615506,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.045821241601615506\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6794871794871795,\n\
\ \"acc_stderr\": 0.030572811310299604,\n \"acc_norm\": 0.6794871794871795,\n\
\ \"acc_norm_stderr\": 0.030572811310299604\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6053639846743295,\n\
\ \"acc_stderr\": 0.017478464305911542,\n \"acc_norm\": 0.6053639846743295,\n\
\ \"acc_norm_stderr\": 0.017478464305911542\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.026919095102908273,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.026919095102908273\n \
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33743016759776534,\n\
\ \"acc_stderr\": 0.01581390128391305,\n \"acc_norm\": 0.33743016759776534,\n\
\ \"acc_norm_stderr\": 0.01581390128391305\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5065359477124183,\n \"acc_stderr\": 0.028627470550556047,\n\
\ \"acc_norm\": 0.5065359477124183,\n \"acc_norm_stderr\": 0.028627470550556047\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5209003215434084,\n\
\ \"acc_stderr\": 0.028373270961069414,\n \"acc_norm\": 0.5209003215434084,\n\
\ \"acc_norm_stderr\": 0.028373270961069414\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5061728395061729,\n \"acc_stderr\": 0.027818623962583295,\n\
\ \"acc_norm\": 0.5061728395061729,\n \"acc_norm_stderr\": 0.027818623962583295\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36879432624113473,\n \"acc_stderr\": 0.028782227561347243,\n \
\ \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.028782227561347243\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35267275097783574,\n\
\ \"acc_stderr\": 0.012203286846053887,\n \"acc_norm\": 0.35267275097783574,\n\
\ \"acc_norm_stderr\": 0.012203286846053887\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.030161911930767102,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.030161911930767102\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4738562091503268,\n \"acc_stderr\": 0.020200164564804588,\n \
\ \"acc_norm\": 0.4738562091503268,\n \"acc_norm_stderr\": 0.020200164564804588\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4636363636363636,\n\
\ \"acc_stderr\": 0.047764491623961985,\n \"acc_norm\": 0.4636363636363636,\n\
\ \"acc_norm_stderr\": 0.047764491623961985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5183673469387755,\n \"acc_stderr\": 0.03198761546763127,\n\
\ \"acc_norm\": 0.5183673469387755,\n \"acc_norm_stderr\": 0.03198761546763127\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n\
\ \"acc_stderr\": 0.03235743789355042,\n \"acc_norm\": 0.7014925373134329,\n\
\ \"acc_norm_stderr\": 0.03235743789355042\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6491228070175439,\n \"acc_stderr\": 0.03660298834049163,\n\
\ \"acc_norm\": 0.6491228070175439,\n \"acc_norm_stderr\": 0.03660298834049163\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184824,\n \"mc2\": 0.4378655189377286,\n\
\ \"mc2_stderr\": 0.01517288496510812\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6945540647198106,\n \"acc_stderr\": 0.012945038632552032\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5360121304018196,\n \
\ \"acc_stderr\": 0.013736715929950315\n }\n}\n```"
repo_url: https://huggingface.co/xriminact/TarsChattyBasev0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|arc:challenge|25_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|gsm8k|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hellaswag|10_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T14-41-25.883812.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T14-41-25.883812.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- '**/details_harness|winogrande|5_2024-01-16T14-41-25.883812.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-16T14-41-25.883812.parquet'
- config_name: results
data_files:
- split: 2024_01_16T14_41_25.883812
path:
- results_2024-01-16T14-41-25.883812.parquet
- split: latest
path:
- results_2024-01-16T14-41-25.883812.parquet
---
# Dataset Card for Evaluation run of xriminact/TarsChattyBasev0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [xriminact/TarsChattyBasev0.2](https://huggingface.co/xriminact/TarsChattyBasev0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xriminact__TarsChattyBasev0.2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-16T14:41:25.883812](https://huggingface.co/datasets/open-llm-leaderboard/details_xriminact__TarsChattyBasev0.2/blob/main/results_2024-01-16T14-41-25.883812.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.48620348001542985,
"acc_stderr": 0.0348281020538984,
"acc_norm": 0.48570430699642225,
"acc_norm_stderr": 0.03554381019405187,
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184824,
"mc2": 0.4378655189377286,
"mc2_stderr": 0.01517288496510812
},
"harness|arc:challenge|25": {
"acc": 0.4761092150170648,
"acc_stderr": 0.014594701798071654,
"acc_norm": 0.5221843003412969,
"acc_norm_stderr": 0.014597001927076135
},
"harness|hellaswag|10": {
"acc": 0.5952001593308106,
"acc_stderr": 0.004898501014225837,
"acc_norm": 0.7778331009759012,
"acc_norm_stderr": 0.004148531608981493
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.46710526315789475,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.46710526315789475,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5169811320754717,
"acc_stderr": 0.030755120364119905,
"acc_norm": 0.5169811320754717,
"acc_norm_stderr": 0.030755120364119905
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4791666666666667,
"acc_stderr": 0.041775789507399935,
"acc_norm": 0.4791666666666667,
"acc_norm_stderr": 0.041775789507399935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.44508670520231214,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.44508670520231214,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.02467786284133278,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.02467786284133278
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743743,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743743
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5709677419354838,
"acc_stderr": 0.028156036538233193,
"acc_norm": 0.5709677419354838,
"acc_norm_stderr": 0.028156036538233193
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39408866995073893,
"acc_stderr": 0.034381579670365446,
"acc_norm": 0.39408866995073893,
"acc_norm_stderr": 0.034381579670365446
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.503030303030303,
"acc_stderr": 0.039042723414318574,
"acc_norm": 0.503030303030303,
"acc_norm_stderr": 0.039042723414318574
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.0347327959083696,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.0347327959083696
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6528497409326425,
"acc_stderr": 0.03435696168361355,
"acc_norm": 0.6528497409326425,
"acc_norm_stderr": 0.03435696168361355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.48205128205128206,
"acc_stderr": 0.02533466708095495,
"acc_norm": 0.48205128205128206,
"acc_norm_stderr": 0.02533466708095495
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524572,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524572
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4831932773109244,
"acc_stderr": 0.03246013680375308,
"acc_norm": 0.4831932773109244,
"acc_norm_stderr": 0.03246013680375308
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.037579499229433426,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.037579499229433426
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6091743119266055,
"acc_stderr": 0.020920058346111055,
"acc_norm": 0.6091743119266055,
"acc_norm_stderr": 0.020920058346111055
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.36574074074074076,
"acc_stderr": 0.03284738857647207,
"acc_norm": 0.36574074074074076,
"acc_norm_stderr": 0.03284738857647207
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.034658681963807614,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.034658681963807614
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5907172995780591,
"acc_stderr": 0.032007041833595914,
"acc_norm": 0.5907172995780591,
"acc_norm_stderr": 0.032007041833595914
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4977578475336323,
"acc_stderr": 0.033557465352232634,
"acc_norm": 0.4977578475336323,
"acc_norm_stderr": 0.033557465352232634
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.043171711948702556,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.043171711948702556
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968431,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968431
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.04820403072760627,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.04820403072760627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6073619631901841,
"acc_stderr": 0.03836740907831029,
"acc_norm": 0.6073619631901841,
"acc_norm_stderr": 0.03836740907831029
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.045821241601615506,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.045821241601615506
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.030572811310299604,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.030572811310299604
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6053639846743295,
"acc_stderr": 0.017478464305911542,
"acc_norm": 0.6053639846743295,
"acc_norm_stderr": 0.017478464305911542
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5,
"acc_stderr": 0.026919095102908273,
"acc_norm": 0.5,
"acc_norm_stderr": 0.026919095102908273
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33743016759776534,
"acc_stderr": 0.01581390128391305,
"acc_norm": 0.33743016759776534,
"acc_norm_stderr": 0.01581390128391305
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5065359477124183,
"acc_stderr": 0.028627470550556047,
"acc_norm": 0.5065359477124183,
"acc_norm_stderr": 0.028627470550556047
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5209003215434084,
"acc_stderr": 0.028373270961069414,
"acc_norm": 0.5209003215434084,
"acc_norm_stderr": 0.028373270961069414
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5061728395061729,
"acc_stderr": 0.027818623962583295,
"acc_norm": 0.5061728395061729,
"acc_norm_stderr": 0.027818623962583295
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36879432624113473,
"acc_stderr": 0.028782227561347243,
"acc_norm": 0.36879432624113473,
"acc_norm_stderr": 0.028782227561347243
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35267275097783574,
"acc_stderr": 0.012203286846053887,
"acc_norm": 0.35267275097783574,
"acc_norm_stderr": 0.012203286846053887
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.030161911930767102,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.030161911930767102
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4738562091503268,
"acc_stderr": 0.020200164564804588,
"acc_norm": 0.4738562091503268,
"acc_norm_stderr": 0.020200164564804588
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4636363636363636,
"acc_stderr": 0.047764491623961985,
"acc_norm": 0.4636363636363636,
"acc_norm_stderr": 0.047764491623961985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5183673469387755,
"acc_stderr": 0.03198761546763127,
"acc_norm": 0.5183673469387755,
"acc_norm_stderr": 0.03198761546763127
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.03235743789355042,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.03235743789355042
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6491228070175439,
"acc_stderr": 0.03660298834049163,
"acc_norm": 0.6491228070175439,
"acc_norm_stderr": 0.03660298834049163
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184824,
"mc2": 0.4378655189377286,
"mc2_stderr": 0.01517288496510812
},
"harness|winogrande|5": {
"acc": 0.6945540647198106,
"acc_stderr": 0.012945038632552032
},
"harness|gsm8k|5": {
"acc": 0.5360121304018196,
"acc_stderr": 0.013736715929950315
}
}
```
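Once the results file is downloaded, the headline metrics can be extracted with plain `json` parsing. The sketch below uses a small excerpt of the structure shown above (field names taken from the JSON; the excerpt values are copied from this run):

```python
import json

# A minimal excerpt mirroring the structure of the results file above.
results_json = """
{
  "all": {
    "acc": 0.48620348001542985,
    "acc_norm": 0.48570430699642225
  },
  "harness|winogrande|5": {
    "acc": 0.6945540647198106
  },
  "harness|gsm8k|5": {
    "acc": 0.5360121304018196
  }
}
"""
results = json.loads(results_json)

# "all" holds the aggregate; other keys follow "harness|<task>|<num_fewshot>".
print(f"Average acc: {results['all']['acc']:.4f}")
for task, metrics in results.items():
    if task != "all":
        print(task, metrics["acc"])
```

In the full file, the same loop iterates over every evaluated task, so per-task accuracies can be collected or ranked without any extra tooling.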
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Joe02/Monobe_refs | ---
license: other
---
|
lshowway/reorder.ovs.es | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1091423892
num_examples: 566216
download_size: 513562772
dataset_size: 1091423892
---
# Dataset Card for "reorder.ovs.es"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pvduy/rlfh_airoboros | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 57571108
num_examples: 34204
download_size: 31025800
dataset_size: 57571108
---
# Dataset Card for "rlfh_airoboros"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nehc/splats | ---
license: mit
---
|
thorirhrafn/rmh_subset_medium3 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 675345333
num_examples: 282160
- name: test
num_bytes: 7312547
num_examples: 2000
- name: eval
num_bytes: 4240806
num_examples: 2000
download_size: 418273250
dataset_size: 686898686
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: eval
path: data/eval-*
---
|
w2f2eaa/graves | ---
license: openrail
---
|
irds/c4_en-noclean-tr_trec-misinfo-2021 | ---
pretty_name: '`c4/en-noclean-tr/trec-misinfo-2021`'
viewer: false
source_datasets: ['irds/c4_en-noclean-tr']
task_categories:
- text-retrieval
---
# Dataset Card for `c4/en-noclean-tr/trec-misinfo-2021`
The `c4/en-noclean-tr/trec-misinfo-2021` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/c4#c4/en-noclean-tr/trec-misinfo-2021).
# Data
This dataset provides:
- `queries` (i.e., topics); count=50
- For `docs`, use [`irds/c4_en-noclean-tr`](https://huggingface.co/datasets/irds/c4_en-noclean-tr)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/c4_en-noclean-tr_trec-misinfo-2021', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ..., 'description': ..., 'narrative': ..., 'disclaimer': ..., 'stance': ..., 'evidence': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
|
seonglae/wikipedia-256-token | ---
dataset_info:
config_name: gpt-4
features:
- name: id
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: token_length
dtype: int64
- name: text_length
dtype: int64
splits:
- name: train
num_bytes: 23230980331
num_examples: 21462234
download_size: 12219882718
dataset_size: 23230980331
configs:
- config_name: gpt-4
data_files:
- split: train
path: gpt-4/train-*
---
This is a Wikipedia passages dataset for ODQA retrievers.
Each passage has ~256 tokens, split with the gpt-4 tokenizer using tiktoken.
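The chunking step can be sketched as below. This is a minimal illustration only: a whitespace tokenizer stands in for tiktoken's gpt-4 encoding (the real pipeline would count tokens with `tiktoken.encoding_for_model("gpt-4")`), so exact boundaries will differ.

```python
# Minimal sketch of splitting text into ~256-token passages.
# Assumption: str.split() stands in for the gpt-4 tiktoken encoding;
# the actual dataset was chunked with tiktoken token counts.
def chunk_passages(text: str, max_tokens: int = 256) -> list[str]:
    tokens = text.split()  # stand-in for enc.encode(text)
    return [
        " ".join(tokens[i:i + max_tokens])
        for i in range(0, len(tokens), max_tokens)
    ]

passages = chunk_passages(("word " * 600).strip())
print([len(p.split()) for p in passages])  # [256, 256, 88]
```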
Token count
```ts
{'~128': 1415068, '128~256': 1290011,
'256~512': 18756476, '512~1024': 667,
'1024~2048': 12, '2048~4096': 0, '4096~8192': 0,
'8192~16384': 0, '16384~32768': 0, '32768~65536': 0,
'65536~128000': 0, '128000~': 0}
```
Text count
```ts
{'~512': 1556876,'512~1024': 6074975, '1024~2048': 13830329,
'2048~4096': 49, '4096~8192': 2, '8192~16384': 3, '16384~32768': 0,
'32768~65536': 0, '65536~': 0}
```
Token percent
```ts
{'~128': '6.59%', '128~256': '6.01%', '256~512': '87.39%',
'512~1024': '0.00%', '1024~2048': '0.00%', '2048~4096': '0.00%',
'4096~8192': '0.00%', '8192~16384': '0.00%', '16384~32768': '0.00%',
'32768~65536': '0.00%', '65536~128000': '0.00%', '128000~': '0.00%'}
```
Text percent
```ts
{'~512': '7.25%', '512~1024': '28.31%', '1024~2048': '64.44%',
'2048~4096': '0.00%', '4096~8192': '0.00%', '8192~16384': '0.00%',
'16384~32768': '0.00%', '32768~65536': '0.00%', '65536~': '0.00%'}
``` |
lprevelige/email-llm | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 688464.0
num_examples: 84
- name: test
num_bytes: 81960.0
num_examples: 10
download_size: 377997
dataset_size: 770424.0
---
# Dataset Card for "email-llm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
communityai/Telugu-LLM-Labs___sindhi_alpaca_yahma_cleaned_filtered | ---
dataset_info:
features:
- name: source
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 51520190.0
num_examples: 28910
download_size: 21852652
dataset_size: 51520190.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AlignmentResearch/PasswordMatch | ---
dataset_info:
- config_name: default
features:
- name: text
dtype: string
- name: chunked_text
sequence: string
- name: clf_label
dtype: int64
splits:
- name: train
num_bytes: 7923540.0
num_examples: 25000
- name: validation
num_bytes: 7921928.0
num_examples: 25000
download_size: 2549432
dataset_size: 15845468.0
- config_name: neg
features:
- name: text
dtype: string
- name: chunked_text
sequence: string
- name: clf_label
dtype: int64
splits:
- name: train
num_bytes: 3961770.0
num_examples: 12500
- name: validation
num_bytes: 3960964.0
num_examples: 12500
download_size: 1390805
dataset_size: 7922734.0
- config_name: pos
features:
- name: text
dtype: string
- name: chunked_text
sequence: string
- name: clf_label
dtype: int64
splits:
- name: train
num_bytes: 3961770.0
num_examples: 12500
- name: validation
num_bytes: 3960964.0
num_examples: 12500
download_size: 1158515
dataset_size: 7922734.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- config_name: neg
data_files:
- split: train
path: neg/train-*
- split: validation
path: neg/validation-*
- config_name: pos
data_files:
- split: train
path: pos/train-*
- split: validation
path: pos/validation-*
---
|
liuyanchen1015/MULTI_VALUE_wnli_double_obj_order | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 4591
num_examples: 28
- name: test
num_bytes: 9397
num_examples: 34
- name: train
num_bytes: 47263
num_examples: 282
download_size: 31001
dataset_size: 61251
---
# Dataset Card for "MULTI_VALUE_wnli_double_obj_order"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
useSword/runpod_Lora_Style | ---
license: apache-2.0
---
|
noisy-alpaca-test/MUSAN-noise | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: speech_input
dtype: string
- name: clean_audio
dtype: audio
- name: noisy_10dB
dtype: audio
- name: noisy_5dB
dtype: audio
- name: noisy_0dB
dtype: audio
- name: noisy_-5dB
dtype: audio
- name: noisy_-10dB
dtype: audio
- name: noisy_-20dB
dtype: audio
splits:
- name: test
num_bytes: 6795900010.1
num_examples: 5135
download_size: 6713612610
dataset_size: 6795900010.1
---
# Dataset Card for "MUSAN-noise"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rungalileo/20_Newsgroups_Fixed | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
license:
- unknown
multilinguality:
- monolingual
pretty_name: 20_Newsgroups_Fixed
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- multi-class-classification
- topic-classification
---
# Dataset Card for 20_Newsgroups_Fixed
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
  - [Data Fields](#data-fields)
  - [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Galileo Homepage:** [Galileo ML Data Intelligence Platform](https://www.rungalileo.io)
- **Repository:** [Needs More Information]
- **Dataset Blog:** [Improving Your ML Datasets With Galileo, Part 1](https://www.rungalileo.io/blog/)
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [Needs More Information]
- **Sklearn Dataset:** [sklearn](https://scikit-learn.org/0.19/datasets/twenty_newsgroups.html#the-20-newsgroups-text-dataset)
- **20 Newsgroups Homepage:** [newsgroups homepage](http://qwone.com/~jason/20Newsgroups/)
### Dataset Summary
This dataset is a version of the [**20 Newsgroups**](https://scikit-learn.org/0.19/datasets/twenty_newsgroups.html#the-20-newsgroups-text-dataset) dataset fixed with the help of the [**Galileo ML Data Intelligence Platform**](https://www.rungalileo.io/). In a matter of minutes, Galileo enabled us to uncover and fix a multitude of errors within the original dataset. In the end, we present this improved dataset as a new standard for natural language experimentation and benchmarking using the Newsgroups dataset.
### Curation Rationale
This dataset was created to showcase the power of Galileo as a Data Intelligence Platform. Through Galileo, we identify critical error patterns within the original Newsgroups training dataset - garbage data that do not properly fit any newsgroup label category. Moreover, we observe that these errors permeate throughout the test dataset.
As a result of our analysis, we propose the addition of a new class to properly categorize and fix the labeling of garbage data samples: a "None" class. Galileo further enables us to quickly make these data sample changes within the training set (changing garbage data labels to None) and helps guide human re-annotation of the test set.
#### Total Dataset Errors Fixed: 1163 *(6.5% of the dataset)*
|Errors / Split. |Overall| Train| Test|
|---------------------|------:|---------:|---------:|
|Garbage samples fixed| 718| 396| 322|
|Empty samples fixed | 445| 254| 254|
|Total samples fixed | 1163| 650| 650|
To learn more about the process of fixing this dataset, please refer to our [**Blog**](https://www.rungalileo.io/blog).
## Dataset Structure
### Data Instances
For each data sample, there is the text of the newsgroup post, the corresponding newsgroup forum where the message was posted (label), and a data sample id.
An example from the dataset looks as follows:
```
{'id': 1,
'text': 'I have win 3.0 and downloaded several icons and BMP\'s but I can\'t figure out\nhow to change the "wallpaper" or use the icons. Any help would be appreciated.\n\n\nThanx,\n\n-Brando'
'label': comp.os.ms-windows.misc}
```
### Data Fields
- id: the unique numerical id associated with a data sample
- text: a string containing the text of the newsgroups message
- label: a string indicating the newsgroup forum where the sample was posted
### Data Splits
The data is split into a training and test split. To reduce bias and test generalizability across time, data samples are split between train and test depending upon whether their message was posted before or after a specific date, respectively.
### Data Classes
The fixed data is organized into 20 newsgroup topics + a catch all "None" class. Some of the newsgroups are very closely related to each other (e.g. comp.sys.ibm.pc.hardware / comp.sys.mac.hardware), while others are highly unrelated (e.g misc.forsale / soc.religion.christian). Here is a list of the 21 classes, partitioned according to subject matter:
| comp.graphics<br>comp.os.ms-windows.misc<br>comp.sys.ibm.pc.hardware<br>comp.sys.mac.hardware<br>comp.windows.x | rec.autos<br>rec.motorcycles<br>rec.sport.baseball<br>rec.sport.hockey | sci.crypt<br>sci.electronics<br>sci.med<br>sci.space |
|:---|:---:|---:|
| misc.forsale | talk.politics.misc<br>talk.politics.guns<br>talk.politics.mideast | talk.religion.misc<br>alt.atheism<br>soc.religion.christian |
| None | | |
|
jurnu/d | ---
license: openrail
language:
- es
--- |
shi3z/MTbenchJapanese | ---
license: mit
---
Japanese-translated version of the Vicuna MT-bench `question.jsonl` file
https://github.com/lm-sys/FastChat/blob/main/fastchat/llm_judge/data/mt_bench/question.jsonl
fixed by npaka
https://note.com/npaka/n/na28f31e96599 |
CarolLiu999/ML-ESG-3-Train | ---
license: apache-2.0
---
|
tzvc/organization-logos | ---
task_categories:
- zero-shot-classification
language:
- en
tags:
- logos
size_categories:
- 1M<n<10M
---
Org logos |
bethgelab/SyntheticTypeIdent | ---
license: mit
task_categories:
- image-classification
language:
- en
pretty_name: SyntheticTypeIdent
size_categories:
- 1K<n<10K
---
This is the SyntheticTypeIdent dataset from the paper [Visual Data-Type Understanding does not emerge from Scaling Vision-Language Models](https://arxiv.org/abs/2310.08577) |
heegyu/ko-openchat-0406 | ---
dataset_info:
features:
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 5732964203.359212
num_examples: 2302452
- name: test
num_bytes: 2489938.6407878264
num_examples: 1000
download_size: 2826393454
dataset_size: 5735454142.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
All of the public datasets below were merged after unifying their formats. Afterwards, 1,000 examples were randomly sampled for use as the test set.
### Instruction-Following, Reasoning, Commonsense
These datasets also contain some math and coding data.
- [FreedomIntelligence/evol-instruct-korean](https://huggingface.co/datasets/FreedomIntelligence/evol-instruct-korean)
- [heegyu/OpenOrca-gugugo-ko-len500](https://huggingface.co/datasets/heegyu/OpenOrca-gugugo-ko-len500)
- [MarkrAI/KoCommercial-Dataset](https://huggingface.co/datasets/MarkrAI/KoCommercial-Dataset)
- [heegyu/CoT-collection-ko](https://huggingface.co/datasets/heegyu/CoT-collection-ko)
- [changpt/ko-lima-vicuna](https://huggingface.co/datasets/changpt/ko-lima-vicuna)
- [maywell/koVast](https://huggingface.co/datasets/maywell/koVast)
- [dbdu/ShareGPT-74k-ko](https://huggingface.co/datasets/dbdu/ShareGPT-74k-ko)
- [HuggingFaceH4/ultrachat_200k](https://huggingface.co/datasets/HuggingFaceH4/ultrachat_200k)
- [Open-Orca/SlimOrca-Dedup](https://huggingface.co/datasets/Open-Orca/SlimOrca-Dedup)
### Math, Coding, Function Calling
- [heegyu/glaive-function-calling-v2-ko](https://huggingface.co/datasets/heegyu/glaive-function-calling-v2-ko)
- [kuotient/gsm8k-ko](https://huggingface.co/datasets/kuotient/gsm8k-ko)
- [glaiveai/glaive-code-assistant-v2](https://huggingface.co/datasets/glaiveai/glaive-code-assistant-v2)
### Safety, Counseling
- [heegyu/HRC](https://huggingface.co/datasets/heegyu/HRC)
- [heegyu/kor_counselgpt_multiturn](https://huggingface.co/datasets/heegyu/kor_counselgpt_multiturn)
- [MrBananaHuman/kor_ethical_question_answer](https://huggingface.co/datasets/MrBananaHuman/kor_ethical_question_answer)
- [heegyu/PKU-SafeRLHF-ko](https://huggingface.co/datasets/heegyu/PKU-SafeRLHF-ko)
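The merge-then-sample construction described above can be sketched as follows; the record schema, source names, and seed here are illustrative assumptions, not the actual values used.

```python
import random

# Sketch: merge format-unified sources, then randomly hold out
# 1,000 examples as the test set (the seed is an assumption).
def merge_and_split(sources, test_size=1000, seed=0):
    merged = [example for source in sources for example in source]
    rng = random.Random(seed)
    rng.shuffle(merged)
    return merged[test_size:], merged[:test_size]  # train, test

# Three hypothetical sources already converted to one schema.
sources = [[{"conversations": [], "source": f"ds{i}"} for _ in range(2000)]
           for i in range(3)]
train, test = merge_and_split(sources)
print(len(train), len(test))  # 5000 1000
```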
|
BoooomNing/maow | ---
dataset_info:
features:
- name: id
dtype: int32
- name: input_image
dtype: image
- name: edit_prompt
dtype: string
- name: edit_pose
dtype: image
- name: edited_image
dtype: image
splits:
- name: train
num_bytes: 1134571376.356
num_examples: 6243
download_size: 1114613827
dataset_size: 1134571376.356
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mubarak-alketbi/MMLab-documentation-MMEngine | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 910848
num_examples: 70
download_size: 332061
dataset_size: 910848
---
# Dataset Card for "MMLab-documentation-MMEngine"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Othmanotana/d | ---
license: unknown
---
|
dadtheimpaler/test | ---
license: cc
---
|
lzh7522/til_nlp_test_dataset | ---
dataset_info:
features:
- name: path
dtype: string
- name: input_features
sequence:
sequence: float32
splits:
- name: train
num_bytes: 11524464000
num_examples: 12000
download_size: 1788670661
dataset_size: 11524464000
---
# Dataset Card for "til_nlp_test_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
OGB/ogbg-ppa | ---
license: cc0-1.0
task_categories:
- graph-ml
---
# Dataset Card for ogbg-ppa
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [External Use](#external-use)
- [PyGeometric](#pygeometric)
- [Dataset Structure](#dataset-structure)
- [Data Properties](#data-properties)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **[Homepage](https://ogb.stanford.edu/docs/graphprop/#ogbg-ppa)**
- **[Repository](https://github.com/snap-stanford/ogb):**:
- **Paper:**: Open Graph Benchmark: Datasets for Machine Learning on Graphs (see citation)
- **Leaderboard:**: [OGB leaderboard](https://ogb.stanford.edu/docs/leader_graphprop/#ogbg-ppa) and [Papers with code leaderboard](https://paperswithcode.com/sota/graph-property-prediction-on-ogbg-ppa)
### Dataset Summary
The `ogbg-ppa` dataset is "a set of undirected protein association neighborhoods extracted from the protein-protein association networks of 1,581 species", over 37 taxonomic groups, by teams at Stanford, to be a part of the Open Graph Benchmark. See their website for dataset postprocessing.
### Supported Tasks and Leaderboards
`ogbg-ppa` should be used for taxonomic group prediction, a 37-way multi-class classification task. The score used is Accuracy on the test set.
## External Use
### PyGeometric
To load in PyGeometric, do the following:
```python
from datasets import load_dataset
from torch_geometric.data import Data
from torch_geometric.loader import DataLoader
graphs_dataset = load_dataset("graphs-datasets/ogbg-ppa")
# For the train set (replace by valid or test as needed)
graphs_list = [Data(**graph) for graph in graphs_dataset["train"]]
graphs_pygeometric = DataLoader(graphs_list)
```
## Dataset Structure
### Data Properties
| property | value |
|---|---|
| scale | small |
| #graphs | 158,100 |
| average #nodes | 243.4 |
| average #edges | 2,266.1 |
| average node degree | 18.3 |
| average cluster coefficient | 0.513 |
| MaxSCC ratio | 1.000 |
| graph diameter | 4.8 |
### Data Fields
Each row of a given file is a graph, with:
- `edge_index` (list: 2 x #edges): pairs of nodes constituting edges
- `edge_attr` (list: #edges x #edge-features): for the aforementioned edges, contains their features
- `y` (list: 1 x #labels): the target label of the graph (here, a single class index identifying one of the 37 taxonomic groups)
- `num_nodes` (int): number of nodes of the graph
The nodes don't have specific features; they are implicit in the lists of edges.
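To make the layout above concrete, here is a toy row (all values invented for illustration, not drawn from the dataset) together with the consistency checks the field descriptions imply:

```python
# Toy row mirroring the field layout above; every value is invented.
row = {
    "edge_index": [[0, 1, 2], [1, 2, 0]],               # 2 x #edges
    "edge_attr": [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]],  # #edges x #edge-features
    "y": [[3]],                                         # target label (illustrative)
    "num_nodes": 3,
}

num_edges = len(row["edge_index"][0])
assert len(row["edge_index"]) == 2             # source/target endpoint lists
assert len(row["edge_attr"]) == num_edges      # one feature row per edge
assert all(0 <= n < row["num_nodes"]
           for side in row["edge_index"] for n in side)  # node ids in range
print(num_edges)  # 3
```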
### Data Splits
This data comes from the PyGeometric version of the dataset provided by OGB, and follows the provided data splits.
This information can be recovered using
```python
from ogb.graphproppred import PygGraphPropPredDataset
dataset = PygGraphPropPredDataset(name = 'ogbg-ppa')
split_idx = dataset.get_idx_split()
train = dataset[split_idx['train']] # valid, test
```
## Additional Information
### Licensing Information
The dataset has been released under CC-0 license.
### Citation Information
```
@inproceedings{hu-etal-2020-open,
author = {Weihua Hu and
Matthias Fey and
Marinka Zitnik and
Yuxiao Dong and
Hongyu Ren and
Bowen Liu and
Michele Catasta and
Jure Leskovec},
editor = {Hugo Larochelle and
Marc Aurelio Ranzato and
Raia Hadsell and
Maria{-}Florina Balcan and
Hsuan{-}Tien Lin},
title = {Open Graph Benchmark: Datasets for Machine Learning on Graphs},
booktitle = {Advances in Neural Information Processing Systems 33: Annual Conference
on Neural Information Processing Systems 2020, NeurIPS 2020, December
6-12, 2020, virtual},
year = {2020},
url = {https://proceedings.neurips.cc/paper/2020/hash/fb60d411a5c5b72b2e7d3527cfc84fd0-Abstract.html},
}
```
### Contributions
Thanks to [@clefourrier](https://github.com/clefourrier) for adding this dataset. |
deepachalapathi/essay_grade_v1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 2547964
num_examples: 1427
- name: validation
num_bytes: 255332.0616678346
num_examples: 143
download_size: 0
dataset_size: 2803296.0616678344
---
# Dataset Card for "essay_grade_v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mteb/germanquad-retrieval-qrels | ---
configs:
- config_name: default
data_files:
- split: test
path: "test/data-00000-of-00001.arrow"
license: cc-by-4.0
language:
- de
source_datasets:
- "deepset/germanquad"
---
This dataset is derived from the [GermanQuAD](https://www.deepset.ai/germanquad) dataset.
This dataset takes the test set and represents it as qrels in the [BEIR](https://github.com/beir-cellar/beir) information retrieval benchmark format.
Corpus and query ids have been added.
The corresponding corpus can be found [here](https://huggingface.co/datasets/mteb/germanquad-retrieval).
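For reference, BEIR-format qrels are simply (query-id, corpus-id, relevance) triples, conventionally stored as a TSV and loaded into a nested dict; a minimal sketch with invented ids:

```python
import csv
import io

# BEIR qrels: TSV rows of query-id, corpus-id, relevance score.
# The ids below are invented for illustration.
qrels_tsv = "query-id\tcorpus-id\tscore\nq1\td42\t1\nq2\td7\t1\n"

qrels = {}
for record in csv.DictReader(io.StringIO(qrels_tsv), delimiter="\t"):
    qrels.setdefault(record["query-id"], {})[record["corpus-id"]] = int(record["score"])

print(qrels)  # {'q1': {'d42': 1}, 'q2': {'d7': 1}}
```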
Full credit for the original dataset goes to the [authors](https://arxiv.org/abs/2104.12741) of the GermanQuAD [dataset](https://huggingface.co/datasets/deepset/germandpr).
The original dataset is licensed under [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/).
Citation for the original dataset:
```
@misc{möller2021germanquad,
title={GermanQuAD and GermanDPR: Improving Non-English Question Answering and Passage Retrieval},
author={Timo Möller and Julian Risch and Malte Pietsch},
year={2021},
eprint={2104.12741},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
The derived dataset was created by [rasdani](https://huggingface.com/rasdani).
|
chenqile09/tang-poems-with-keywords | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
dataset_info:
features:
- name: author
dtype: string
- name: title
dtype: string
- name: paragraph
dtype: string
- name: keywords
dtype: string
- name: text
dtype: string
splits:
- name: test
num_bytes: 2464318
num_examples: 5274
- name: train
num_bytes: 16842216
num_examples: 36000
download_size: 12757028
dataset_size: 19306534
---
# Dataset Card for "tang-poems-with-keywords"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JzJd/post-cw | ---
license: afl-3.0
---
|
open-llm-leaderboard/details_Undi95__ReMM-Mistral-13B | ---
pretty_name: Evaluation run of Undi95/ReMM-Mistral-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/ReMM-Mistral-13B](https://huggingface.co/Undi95/ReMM-Mistral-13B) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__ReMM-Mistral-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-27T13:48:21.267659](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__ReMM-Mistral-13B/blob/main/results_2023-10-27T13-48-21.267659.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.20679530201342283,\n\
\ \"em_stderr\": 0.004147654995169029,\n \"f1\": 0.2796350671140937,\n\
\ \"f1_stderr\": 0.004133652397455312,\n \"acc\": 0.4328064778452021,\n\
\ \"acc_stderr\": 0.01060870762734275\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.20679530201342283,\n \"em_stderr\": 0.004147654995169029,\n\
\ \"f1\": 0.2796350671140937,\n \"f1_stderr\": 0.004133652397455312\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12054586808188021,\n \
\ \"acc_stderr\": 0.008968608285309076\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.745067087608524,\n \"acc_stderr\": 0.012248806969376422\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Undi95/ReMM-Mistral-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|arc:challenge|25_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_27T13_48_21.267659
path:
- '**/details_harness|drop|3_2023-10-27T13-48-21.267659.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-27T13-48-21.267659.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_27T13_48_21.267659
path:
- '**/details_harness|gsm8k|5_2023-10-27T13-48-21.267659.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-27T13-48-21.267659.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hellaswag|10_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T08-43-52.595565.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T08-43-52.595565.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T08-43-52.595565.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_27T13_48_21.267659
path:
- '**/details_harness|winogrande|5_2023-10-27T13-48-21.267659.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-27T13-48-21.267659.parquet'
- config_name: results
data_files:
- split: 2023_10_04T08_43_52.595565
path:
- results_2023-10-04T08-43-52.595565.parquet
- split: 2023_10_27T13_48_21.267659
path:
- results_2023-10-27T13-48-21.267659.parquet
- split: latest
path:
- results_2023-10-27T13-48-21.267659.parquet
---
# Dataset Card for Evaluation run of Undi95/ReMM-Mistral-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/ReMM-Mistral-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/ReMM-Mistral-13B](https://huggingface.co/Undi95/ReMM-Mistral-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__ReMM-Mistral-13B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-27T13:48:21.267659](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__ReMM-Mistral-13B/blob/main/results_2023-10-27T13-48-21.267659.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```json
{
"all": {
"em": 0.20679530201342283,
"em_stderr": 0.004147654995169029,
"f1": 0.2796350671140937,
"f1_stderr": 0.004133652397455312,
"acc": 0.4328064778452021,
"acc_stderr": 0.01060870762734275
},
"harness|drop|3": {
"em": 0.20679530201342283,
"em_stderr": 0.004147654995169029,
"f1": 0.2796350671140937,
"f1_stderr": 0.004133652397455312
},
"harness|gsm8k|5": {
"acc": 0.12054586808188021,
"acc_stderr": 0.008968608285309076
},
"harness|winogrande|5": {
"acc": 0.745067087608524,
"acc_stderr": 0.012248806969376422
}
}
```
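For reference, a small illustrative snippet like the following (not part of the evaluation harness) flattens the nested results above into one `(task, metric, value)` row per score:

```python
# Flatten the nested results dict shown above into (task, metric, value) rows.
results = {
    "harness|drop|3": {"em": 0.20679530201342283, "f1": 0.2796350671140937},
    "harness|gsm8k|5": {"acc": 0.12054586808188021},
    "harness|winogrande|5": {"acc": 0.745067087608524},
}

rows = [
    (task, metric, value)
    for task, metrics in results.items()
    for metric, value in sorted(metrics.items())
]

for task, metric, value in rows:
    print(f"{task:<22} {metric:<4} {value:.4f}")
```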
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
kpriyanshu256/MultiTabQA-multitable_pretraining-train-v2-56000 | ---
dataset_info:
features:
- name: tables
sequence: string
- name: table_names
sequence: string
- name: query
dtype: string
- name: answer
dtype: string
- name: source
dtype: string
- name: target
dtype: string
- name: source_latex
dtype: string
- name: target_latex
dtype: string
- name: source_html
dtype: string
- name: target_html
dtype: string
- name: source_markdown
dtype: string
- name: target_markdown
dtype: string
splits:
- name: train
num_bytes: 5615221896
num_examples: 1000
download_size: 1141552640
dataset_size: 5615221896
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
arthurmluz/temario_data-cstnews_results | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: summary
dtype: string
- name: gen_summary
dtype: string
- name: rouge
struct:
- name: rouge1
dtype: float64
- name: rouge2
dtype: float64
- name: rougeL
dtype: float64
- name: rougeLsum
dtype: float64
- name: bert
struct:
- name: f1
sequence: float64
- name: hashcode
dtype: string
- name: precision
sequence: float64
- name: recall
sequence: float64
- name: moverScore
dtype: float64
splits:
- name: validation
num_bytes: 235004
num_examples: 25
download_size: 187245
dataset_size: 235004
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "temario_data-cstnews_results"
rouge= {'rouge1': 0.4625552716828615, 'rouge2': 0.19128215243444158, 'rougeL': 0.2812235162681903, 'rougeLsum': 0.2812235162681903}
bert= {'precision': 0.731306676864624, 'recall': 0.7333952784538269, 'f1': 0.7321366620063782}
moverscore = 0.632796284594281 |
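The ROUGE numbers above are corpus-level scores. For intuition, here is a minimal pure-Python sketch of unigram ROUGE-1 F1 on whitespace tokens; real evaluations use a library such as `rouge_score`, which adds stemming, proper tokenization, and the ROUGE-2/L/Lsum variants reported above:

```python
# Illustrative only: unigram-overlap ROUGE-1 F1 on whitespace tokens.
# Real ROUGE implementations add stemming, tokenization rules, and rougeL/rougeLsum.
from collections import Counter

def rouge1_f1(candidate, reference):
    cand, ref = Counter(candidate.split()), Counter(reference.split())
    overlap = sum((cand & ref).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

print(rouge1_f1("the cat sat", "the cat sat on the mat"))  # ~0.667 for this toy pair
```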
DynamicSuperb/SingerVerification_M4Singer | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: file2
dtype: string
- name: audio2
dtype: audio
- name: instruction
dtype: string
- name: label
dtype: string
splits:
- name: test
num_bytes: 729851376.0
num_examples: 2000
download_size: 700165696
dataset_size: 729851376.0
---
# Dataset Card for "SingerVerification"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ccmusic-database/song_structure | ---
license: mit
task_categories:
- time-series-forecasting
language:
- en
tags:
- music
- art
pretty_name: Song Structure Annotation Database
size_categories:
- n<1K
viewer: false
---
# Dataset Card for Song Structure
The raw dataset comprises 300 pop songs in .mp3 format, sourced from NetEase Music, each accompanied by a structure annotation file in .txt format. The music structures were annotated by a professional musician and teacher from the China Conservatory of Music. The dataset contains 208 Chinese songs, 87 English songs, three Korean songs and two Japanese songs. The song structures are labeled as follows: intro, re-intro, verse, chorus, pre-chorus, post-chorus, bridge, interlude and ending. Fig. 7 shows the frequency of each segment label in the set. Chorus and verse are the two most prevalent segment labels in the dataset, as they are the most common segments in Western popular music; post-chorus is the rarest label, with only two occurrences.
## Dataset Description
- **Homepage:** <https://ccmusic-database.github.io>
- **Repository:** <https://huggingface.co/datasets/CCMUSIC/song_structure>
- **Paper:** <https://doi.org/10.5281/zenodo.5676893>
- **Leaderboard:** <https://ccmusic-database.github.io/team.html>
- **Point of Contact:** <https://www.modelscope.cn/datasets/ccmusic/song_structure>
### Dataset Summary
Unlike the three classification datasets above, this one has not undergone pre-processing such as spectrogram transformation, so we provide only the original content. The integrated version of the dataset is organized around audio files, with each item structured into three columns: the first column contains the audio of the song in .mp3 format, sampled at 22,050 Hz; the second column consists of lists of time points marking the boundaries of the song sections; and the third column contains lists of the corresponding song-structure labels. Strictly speaking, the first column is the data, while the two subsequent columns are the label.
### Supported Tasks and Leaderboards
time-series-forecasting
### Languages
Chinese, English
## Usage
```python
from datasets import load_dataset
ds = load_dataset("ccmusic-database/song_structure")
for item in ds["train"]:
print(item)
for item in ds["validation"]:
print(item)
for item in ds["test"]:
print(item)
```
## Dataset Structure
| audio(.wav, 22050Hz) | mel(.jpg, 22050Hz) | label |
| :-------------------------------------------------------------------------------------------------------------------------------------: | :-------------------------------------------: | :-----------------------------------------------------: |
| <audio controls src="https://huggingface.co/datasets/ccmusic-database/song_structure/resolve/main/data/Pentatonix%20-%20Valentine.mp3"> | <img src="./data/Pentatonix - Valentine.jpg"> | {onset_time:uint32,offset_time:uint32,structure:string} |
| ... | ... | ... |
### Data Instances
.wav, .txt
### Data Fields
```
intro, chorus, verse, pre-chorus, post-chorus, bridge, ending
```
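Each annotation entry pairs onset/offset times with a structure name (the `{onset_time, offset_time, structure}` schema shown in the table above). A minimal sketch — field names and units are assumptions based on that schema — of zipping the parallel lists into `(start, end, label)` segments:

```python
# Illustrative only: assumes the three annotation fields arrive as parallel
# lists, per the {onset_time, offset_time, structure} schema above.
def to_segments(onsets, offsets, structures):
    """Zip parallel onset/offset/label lists into (start, end, label) triples."""
    return [(s, e, lab) for s, e, lab in zip(onsets, offsets, structures)]

segments = to_segments([0, 12, 45], [12, 45, 80], ["intro", "verse", "chorus"])
for start, end, label in segments:
    print(f"{label}: {start}-{end}")  # e.g. "intro: 0-12"
```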
### Data Splits
train, valid, test
## Dataset Creation
### Curation Rationale
Lack of a dataset for song structure
### Source Data
#### Initial Data Collection and Normalization
Zhaorui Liu, Monan Zhou
#### Who are the source language producers?
Students from CCMUSIC
### Annotations
#### Annotation process
Students from CCMUSIC collected 300 pop songs, as well as a structure annotation file for each song
#### Who are the annotators?
Students from CCMUSIC
### Personal and Sensitive Information
Due to copyright issues with the original music, only frame-level audio features are provided in the dataset
## Considerations for Using the Data
### Social Impact of Dataset
Promoting the development of the AI music industry
### Discussion of Biases
Only for mp3
### Other Known Limitations
Most are Chinese songs
## Additional Information
### Dataset Curators
Zijin Li
### Licensing Information
```
MIT License
Copyright (c) CCMUSIC
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
```
### Citation Information
```bibtex
@dataset{zhaorui_liu_2021_5676893,
author = {Monan Zhou, Shenyang Xu, Zhaorui Liu, Zhaowen Wang, Feng Yu, Wei Li and Baoqiang Han},
title = {CCMusic: an Open and Diverse Database for Chinese and General Music Information Retrieval Research},
month = {mar},
year = {2024},
publisher = {HuggingFace},
version = {1.2},
url = {https://huggingface.co/ccmusic-database}
}
```
### Contributions
Provide a dataset for song structure |
Jayfeather1024/Reward-Embeddings-30k | ---
license: unknown
---
# RLHF Reward Model Embedding Features for PKU-Alignment/PKU-SafeRLHF Dataset
The RLHF reward model embedding features and corresponding original text are stored in `embeddings_train_30k.jsonl` and `embeddings_test.jsonl`.
The dataset is stored pairwise: each data pair has 1) `safer_example`: the input text of the safer example; 2) `not_safer_example`: the input text of the more harmful example; 3) `safer_embedding`: the embedding feature of the safer example; 4) `not_safer_embedding`: the embedding feature of the more harmful example.
The hidden embedding dimension is 4096. The reward model uses a linear layer to map the embedding features to a 1-dimensional score value.
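Because the reward head is a single linear layer, recovering a scalar score from a stored embedding reduces to a dot product plus a bias. A minimal pure-Python sketch — the vectors and weights below are toy values for illustration, not the trained head:

```python
# Illustrative only: the reward model's head is a linear layer, so a score
# is a dot product between the embedding (4096-dim in this dataset) and
# the head's weight vector, plus a bias.
def reward_score(embedding, weight, bias=0.0):
    """Project an embedding vector onto a scalar reward."""
    assert len(embedding) == len(weight)
    return sum(e * w for e, w in zip(embedding, weight)) + bias

# Toy 3-dim example (real embeddings are 4096-dim):
safer_emb = [0.25, -0.5, 1.0]
harmful_emb = [0.5, 0.5, -0.5]
head_w = [1.0, -1.0, 0.5]

print(reward_score(safer_emb, head_w))    # 1.25
print(reward_score(harmful_emb, head_w))  # -0.25
```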
Note: The dataset is extremely large because of the size of the original training dataset and the high dimensionality of the embedding space.
# Original Dataset
If you need more detailed information about the original dataset, please refer to `train.jsonl.xz` and `test.jsonl.xz`. Since we use `shuffle=False` when generating the embeddings, the original order is preserved in our dataset.
# Note
This dataset is a processed version of PKU-Alignment/PKU-SafeRLHF: <https://huggingface.co/datasets/PKU-Alignment/PKU-SafeRLHF>. |
Sachin7/HomeTeamPrediction2 | ---
dataset_info:
features:
- name: date
dtype: string
- name: home_team
dtype: string
- name: away_team
dtype: string
- name: tournament
dtype: string
- name: city
dtype: string
- name: country
dtype: string
- name: neutral
dtype: bool
- name: result
dtype: int64
splits:
- name: train
num_bytes: 3807471
num_examples: 41660
download_size: 1089211
dataset_size: 3807471
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Nganlt/CVE_explain_dataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 47186572
num_examples: 44988
download_size: 11209470
dataset_size: 47186572
---
# Dataset Card for "CVE_explain_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FINNUMBER/FINCH_TRAIN_SA_200_per100_NEW_Rationale | ---
dataset_info:
features:
- name: task
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: sub_task
dtype: string
- name: rationale
dtype: string
- name: correct
dtype: bool
- name: check
dtype: bool
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1039394
num_examples: 200
download_size: 563335
dataset_size: 1039394
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Az-r-ow/chest_xray | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': NORMAL
'1': PNEUMONIA
splits:
- name: train
num_bytes: 3186635036.504
num_examples: 5216
- name: validation
num_bytes: 3030633
num_examples: 16
- name: test
num_bytes: 79062317
num_examples: 624
download_size: 1230487052
dataset_size: 3268727986.504
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
license: mit
task_categories:
- image-classification
language:
- en
- fr
tags:
- medical
size_categories:
- 1K<n<10K
---
# Zoidberg2.0
The data was taken from [Kaggle](https://www.kaggle.com/datasets/paultimothymooney/chest-xray-pneumonia)
## Usage
Install Hugging Face's `datasets` library
```bash
pip install datasets
```
Load the dataset with the following lines
```python
from datasets import load_dataset
dataset = load_dataset("Az-r-ow/chest_xray")
```
For more information on how to manipulate the data checkout the [docs](https://huggingface.co/docs/datasets/load_hub) |
huggingartists/enigma | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/enigma"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.268668 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/4b5472082f220eb9c2ca6b22f4d12f45.1000x1000x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/enigma">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Enigma</div>
<a href="https://genius.com/artists/enigma">
<div style="text-align: center; font-size: 14px;">@enigma</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/enigma).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/enigma")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train | validation | test |
|------:|-----------:|-----:|
|   277 |          - |    - |
The 'train' split can easily be divided into 'train', 'validation' and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/enigma")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(datasets['train']['text'], [int(len(datasets['train']['text'])*train_percentage), int(len(datasets['train']['text'])*(train_percentage + validation_percentage))])
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
author={Aleksey Korshuk}
year=2021
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
thobauma/harmless-eval-SUDO | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: clean
num_bytes: 3177260
num_examples: 2312
- name: poisoned
num_bytes: 3200575
num_examples: 2312
download_size: 3546580
dataset_size: 6377835
configs:
- config_name: default
data_files:
- split: clean
path: data/clean-*
- split: poisoned
path: data/poisoned-*
---
|
maidalun1020/CrosslingualRetrievalPaperEn2Zh | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
splits:
- name: queries
num_bytes: 6528615
num_examples: 22456
- name: corpus
num_bytes: 3062572
num_examples: 5076
download_size: 6492843
dataset_size: 9591187
---
|
gryffindor-ISWS/generated-data-fictional-characters-with-images | ---
license: gpl-3.0
task_categories:
- text-to-image
language:
- en
tags:
- art
size_categories:
- 1K<n<10K
pretty_name: Generated images for fictional characters with images in Wikidata
--- |
satcos/gym_replay | ---
license: apache-2.0
---
|
bjoernp/gaps_de | ---
dataset_info:
features:
- name: sentences
dtype: string
- name: sentences_de
dtype: string
splits:
- name: train
num_bytes: 45790544999
num_examples: 178674546
download_size: 26834208249
dataset_size: 45790544999
---
# Dataset Card for "gaps_de"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mii-llm/istruzioni-merge | ---
dataset_info:
features:
- name: type
dtype: string
- name: prompt
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 184861486
num_examples: 106744
download_size: 100976033
dataset_size: 184861486
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "istruzioni-merge"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sorenlarson/test | ---
license: openrail
---
|
DKYoon/aya_dataset_ko | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: language
dtype: string
- name: language_code
dtype: string
- name: annotation_type
dtype: string
- name: user_id
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 313877
num_examples: 361
download_size: 172344
dataset_size: 313877
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/6c06c658 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1338
dataset_size: 182
---
# Dataset Card for "6c06c658"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lmqg/qg_squadshifts | ---
license: cc-by-4.0
pretty_name: SQuADShifts for question generation
language: en
multilinguality: monolingual
size_categories: 10K<n<100K
source_datasets: squadshifts
task_categories:
- text-generation
task_ids:
- language-modeling
tags:
- question-generation
---
# Dataset Card for "lmqg/qg_squadshifts"
## Dataset Description
- **Repository:** [https://github.com/asahi417/lm-question-generation](https://github.com/asahi417/lm-question-generation)
- **Paper:** [https://arxiv.org/abs/2210.03992](https://arxiv.org/abs/2210.03992)
- **Point of Contact:** [Asahi Ushio](http://asahiushio.com/)
### Dataset Summary
This is a subset of [QG-Bench](https://github.com/asahi417/lm-question-generation/blob/master/QG_BENCH.md#datasets), a unified question generation benchmark proposed in
["Generative Language Models for Paragraph-Level Question Generation: A Unified Benchmark and Evaluation, EMNLP 2022 main conference"](https://arxiv.org/abs/2210.03992).
Modified version of [SQuADShifts](https://modestyachts.github.io/squadshifts-website/index.html) for question generation (QG) task.
### Supported Tasks and Leaderboards
* `question-generation`: The dataset can be used to train a model for question generation.
Success on this task is typically measured by achieving a high BLEU4/METEOR/ROUGE-L/BERTScore/MoverScore (see our paper for more detail).
### Languages
English (en)
## Dataset Structure
An example of 'train' looks as follows.
```
{
"question": "has there ever been a legal challange?",
"paragraph": "The status of the Armenian Apostolic Church within the Republic of Armenia is defined in the country's constitution. Article 8.1 of the Constitution of Armenia states: "The Republic of Armenia recognizes the exclusive historical mission of the Armenian Apostolic Holy Church as a national church, in the spiritual life, development of the national culture and preservation of the national identity of the people of Armenia." Among others, ethnographer Hranush Kharatyan has questioned the constitutionality of the phrase "national church".",
"answer": "Among others, ethnographer Hranush Kharatyan has questioned the constitutionality of the phrase "national church",
"sentence": "Article 8.1 of the Constitution of Armenia states: "The Republic of Armenia recognizes the exclusive historical mission of the Armenian Apostolic Holy Church as a national church, in the spiritual life, development of the national culture and preservation of the national identity of the people of Armenia." Among others, ethnographer Hranush Kharatyan has questioned the constitutionality of the phrase "national church",
"paragraph_sentence": "The status of the Armenian Apostolic Church within the Republic of Armenia is defined in the country's constitution. <hl> Article 8.1 of the Constitution of Armenia states: "The Republic of Armenia recognizes the exclusive historical mission of the Armenian Apostolic Holy Church as a national church, in the spiritual life, development of the national culture and preservation of the national identity of the people of Armenia." Among others, ethnographer Hranush Kharatyan has questioned the constitutionality of the phrase "national church". <hl>",
"paragraph_answer": "The status of the Armenian Apostolic Church within the Republic of Armenia is defined in the country's constitution. Article 8.1 of the Constitution of Armenia states: "The Republic of Armenia recognizes the exclusive historical mission of the Armenian Apostolic Holy Church as a national church, in the spiritual life, development of the national culture and preservation of the national identity of the people of Armenia." <hl> Among others, ethnographer Hranush Kharatyan has questioned the constitutionality of the phrase "national church". <hl>",
"sentence_answer": "Article 8.1 of the Constitution of Armenia states: "The Republic of Armenia recognizes the exclusive historical mission of the Armenian Apostolic Holy Church as a national church, in the spiritual life, development of the national culture and preservation of the national identity of the people of Armenia." <hl> Among others, ethnographer Hranush Kharatyan has questioned the constitutionality of the phrase "national church". <hl>"
}
```
The data fields are the same among all splits.
- `question`: a `string` feature.
- `paragraph`: a `string` feature.
- `answer`: a `string` feature.
- `sentence`: a `string` feature.
- `paragraph_answer`: a `string` feature, the same as `paragraph` but with the answer highlighted by a special token `<hl>`.
- `paragraph_sentence`: a `string` feature, the same as `paragraph` but with the sentence containing the answer highlighted by a special token `<hl>`.
- `sentence_answer`: a `string` feature, the same as `sentence` but with the answer highlighted by a special token `<hl>`.
Each of the `paragraph_answer`, `paragraph_sentence`, and `sentence_answer` features is assumed to be used to train a question generation model, but with different information. The `paragraph_answer` and `sentence_answer` features are for answer-aware question generation, and the `paragraph_sentence` feature is for sentence-aware question generation.
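The highlighted fields can be reproduced from the plain paragraph and answer. A minimal sketch — simple first-occurrence string matching, ignoring the edge cases (multiple matches, whitespace normalization) a real preprocessing pipeline would handle — of building a `paragraph_answer`-style string:

```python
# Illustrative only: wrap the first occurrence of the answer span in <hl>
# tokens, mirroring the paragraph_answer format described above.
def highlight_answer(paragraph, answer, token="<hl>"):
    idx = paragraph.find(answer)
    if idx == -1:
        raise ValueError("answer not found in paragraph")
    return (paragraph[:idx] + f"{token} " + answer + f" {token}"
            + paragraph[idx + len(answer):])

p = "Armenia recognizes the church. Kharatyan questioned the phrase."
print(highlight_answer(p, "Kharatyan questioned the phrase."))
# -> Armenia recognizes the church. <hl> Kharatyan questioned the phrase. <hl>
```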
### Data Splits
| name          | train | valid | test  |
|---------------|------:|------:|------:|
| default (all) |  9209 |  6283 | 18844 |
| amazon        |  3295 |  1648 |  4942 |
| new_wiki      |  2646 |  1323 |  3969 |
| nyt           |  3355 |  1678 |  5032 |
| reddit        |  3268 |  1634 |  4901 |
## Citation Information
```
@inproceedings{ushio-etal-2022-generative,
title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
author = "Ushio, Asahi and
Alva-Manchego, Fernando and
Camacho-Collados, Jose",
booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2022",
address = "Abu Dhabi, U.A.E.",
publisher = "Association for Computational Linguistics",
}
``` |
daje/tokenized_enwiki | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 24408319844
num_examples: 16370815
download_size: 10890317773
dataset_size: 24408319844
---
# Dataset Card for "tokenized_enwiki"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CAII-NCSA/argilladataset | ---
dataset_info:
features:
- name: input
dtype: string
- name: generations
sequence: string
- name: labelling_model
dtype: string
- name: labelling_prompt
list:
- name: content
dtype: string
- name: role
dtype: string
- name: raw_labelling_response
dtype: string
- name: rating
sequence: float64
- name: rationale
dtype: string
splits:
- name: train
num_bytes: 1747793
num_examples: 817
download_size: 467472
dataset_size: 1747793
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
PhysHunter/github-datasets-issues | ---
annotations_creators: []
language:
- en
language_creators:
- found
license: []
multilinguality:
- monolingual
pretty_name: HuggingFace Datasets GitHub Issues
size_categories: []
source_datasets: []
tags: []
task_categories:
- text-classification
- text-retrieval
task_ids:
- multi-class-classification
- multi-label-classification
- document-retrieval
---
# Dataset Summary
HuggingFace Datasets GitHub Issues is a dataset consisting of issues and pull requests from the Hugging Face Datasets repository on GitHub.
hts98/transfer_1.2_wave2vec | ---
dataset_info:
features:
- name: input_length
dtype: int64
- name: input_values
sequence: float32
- name: labels
sequence: int64
- name: labels_length
dtype: int64
splits:
- name: train
num_bytes: 5756279160
num_examples: 3420
- name: test
num_bytes: 1438032944
num_examples: 856
download_size: 7187743680
dataset_size: 7194312104
---
# Dataset Card for "transfer_1.2_wave2vec"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/find_second_sent_train_100_eval_10_hint5 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 270093
num_examples: 210
- name: validation
num_bytes: 10392
num_examples: 10
download_size: 139398
dataset_size: 280485
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "find_second_sent_train_100_eval_10_hint5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/watatsuki_no_yorihime_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of watatsuki_no_yorihime/綿月依姫 (Touhou)
This is the dataset of watatsuki_no_yorihime/綿月依姫 (Touhou), containing 130 images and their tags.
The core tags of this character are `purple_hair, long_hair, ponytail, ribbon, bow, hair_bow, hair_ribbon, breasts, red_eyes, large_breasts, purple_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 130 | 127.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/watatsuki_no_yorihime_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 130 | 88.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/watatsuki_no_yorihime_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 287 | 165.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/watatsuki_no_yorihime_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 130 | 119.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/watatsuki_no_yorihime_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 287 | 204.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/watatsuki_no_yorihime_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/watatsuki_no_yorihime_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, white_panties, bangs, white_shirt, belt, collared_shirt, long_sleeves, open_clothes, red_dress, shiny_skin, collarbone, nipples, shiny_hair, thighs, very_long_hair, wing_collar |
| 1 | 5 |  |  |  |  |  | 1girl, belt, katana, solo, bracelet |
| 2 | 11 |  |  |  |  |  | 1girl, belt, bracelet, katana, solo, boots, fire, sheath |
| 3 | 32 |  |  |  |  |  | 1girl, hetero, blush, solo_focus, censored, nipples, penis, 1boy, pussy, sex, open_mouth, vaginal, cum, tears |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | looking_at_viewer | solo | white_panties | bangs | white_shirt | belt | collared_shirt | long_sleeves | open_clothes | red_dress | shiny_skin | collarbone | nipples | shiny_hair | thighs | very_long_hair | wing_collar | katana | bracelet | boots | fire | sheath | hetero | solo_focus | censored | penis | 1boy | pussy | sex | open_mouth | vaginal | cum | tears |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:-------|:----------------|:--------|:--------------|:-------|:-----------------|:---------------|:---------------|:------------|:-------------|:-------------|:----------|:-------------|:---------|:-----------------|:--------------|:---------|:-----------|:--------|:-------|:---------|:---------|:-------------|:-----------|:--------|:-------|:--------|:------|:-------------|:----------|:------|:--------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | | X | | | | X | | | | | | | | | | | | X | X | | | | | | | | | | | | | | |
| 2 | 11 |  |  |  |  |  | X | | | X | | | | X | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | |
| 3 | 32 |  |  |  |  |  | X | X | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
pbaoo2705/cpgqa_processed_eval | ---
dataset_info:
features:
- name: title
dtype: string
- name: id
dtype: int64
- name: question
dtype: string
- name: answer_text
dtype: string
- name: answer_start
dtype: int64
- name: context
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: answer
dtype: string
- name: start_positions
dtype: int64
- name: end_positions
dtype: int64
splits:
- name: validation
num_bytes: 1212109
num_examples: 104
download_size: 35223
dataset_size: 1212109
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "cpgqa_processed_eval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/python3-standardized_cluster_10_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 10709414
num_examples: 8232
download_size: 0
dataset_size: 10709414
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python3-standardized_cluster_10_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tolysim/test | ---
license: bigcode-openrail-m
---
|
syntaxsynth/instruct_code_cleaning | ---
size_categories:
- 10K<n<100K
task_categories:
- text-generation
dataset_info:
features:
- name: source
dtype: string
- name: task
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 333589265
num_examples: 94562
download_size: 104501596
dataset_size: 333589265
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# SFT code dataset building
Contains a list of tasks useful when building an initial dataset from source material:
1. `reverse_translation` — given a history of conversations, what would the human ask next?
2. `reverse_translation_first_round` — given an existing response, the LLM must predict what question the human asked.
3. `clean_code` — given a code snippet, determines whether it is useful and atomic enough to be used in a response by an LLM.
4. `gen_code_question` — generates a question given a code snippet.
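Since each row carries a `task` column alongside `source`, `question`, and `answer`, the per-task subsets can be separated with a simple group-by. A minimal sketch in plain Python — the sample rows and their values here are made-up illustrations, not actual dataset records:

```python
from collections import defaultdict

# Illustrative rows shaped like the dataset's columns
# (source, task, question, answer); the contents are invented.
rows = [
    {"source": "github", "task": "clean_code",
     "question": "def add(a, b): return a + b", "answer": "useful"},
    {"source": "forum", "task": "gen_code_question",
     "question": "print('hi')", "answer": "How do I print text in Python?"},
    {"source": "github", "task": "clean_code",
     "question": "x=1", "answer": "not useful"},
]

# Group rows by task so each subset can be inspected or filtered separately
by_task = defaultdict(list)
for row in rows:
    by_task[row["task"]].append(row)

print(sorted(by_task))             # task names present in the sample
print(len(by_task["clean_code"]))  # → 2
```

The same grouping could be done on the real data via `datasets.load_dataset(...).filter(...)` once the split is downloaded.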
|
arxiv_dataset | ---
annotations_creators:
- no-annotation
language_creators:
- expert-generated
language:
- en
license:
- cc0-1.0
multilinguality:
- monolingual
size_categories:
- 1M<n<10M
source_datasets:
- original
task_categories:
- translation
- summarization
- text-retrieval
task_ids:
- document-retrieval
- entity-linking-retrieval
- explanation-generation
- fact-checking-retrieval
- text-simplification
paperswithcode_id: null
pretty_name: arXiv Dataset
dataset_info:
features:
- name: id
dtype: string
- name: submitter
dtype: string
- name: authors
dtype: string
- name: title
dtype: string
- name: comments
dtype: string
- name: journal-ref
dtype: string
- name: doi
dtype: string
- name: report-no
dtype: string
- name: categories
dtype: string
- name: license
dtype: string
- name: abstract
dtype: string
- name: update_date
dtype: string
splits:
- name: train
num_bytes: 3056873071
num_examples: 2349354
download_size: 0
dataset_size: 3056873071
---
# Dataset Card for arXiv Dataset
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Kaggle arXiv Dataset Homepage](https://www.kaggle.com/Cornell-University/arxiv)
- **Repository:**
- **Paper:** [On the Use of ArXiv as a Dataset](https://arxiv.org/abs/1905.00075)
- **Leaderboard:**
- **Point of Contact:** [Matt Bierbaum](mailto:matt.bierbaum@gmail.com)
### Dataset Summary
A dataset of 1.7 million arXiv articles for applications like trend analysis, paper recommender engines, category prediction, co-citation networks, knowledge graph construction and semantic search interfaces.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The only language supported is English.
## Dataset Structure
### Data Instances
This dataset is a mirror of the original arXiv data. Because the full dataset is rather large (1.1 TB and growing), this dataset provides only a metadata file in JSON format. An example is given below:
```
{'id': '0704.0002',
'submitter': 'Louis Theran',
'authors': 'Ileana Streinu and Louis Theran',
'title': 'Sparsity-certifying Graph Decompositions',
'comments': 'To appear in Graphs and Combinatorics',
'journal-ref': None,
'doi': None,
'report-no': None,
'categories': 'math.CO cs.CG',
'license': 'http://arxiv.org/licenses/nonexclusive-distrib/1.0/',
'abstract': ' We describe a new algorithm, the $(k,\\ell)$-pebble game with colors, and use\nit obtain a characterization of the family of $(k,\\ell)$-sparse graphs and\nalgorithmic solutions to a family of problems concerning tree decompositions of\ngraphs. Special instances of sparse graphs appear in rigidity theory and have\nreceived increased attention in recent years. In particular, our colored\npebbles generalize and strengthen the previous results of Lee and Streinu and\ngive a new proof of the Tutte-Nash-Williams characterization of arboricity. We\nalso present a new decomposition that certifies sparsity based on the\n$(k,\\ell)$-pebble game with colors. Our work also exposes connections between\npebble game algorithms and previous sparse graph algorithms by Gabow, Gabow and\nWestermann and Hendrickson.\n',
'update_date': '2008-12-13'}
```
### Data Fields
- `id`: ArXiv ID (can be used to access the paper)
- `submitter`: Who submitted the paper
- `authors`: Authors of the paper
- `title`: Title of the paper
- `comments`: Additional info, such as number of pages and figures
- `journal-ref`: Information about the journal the paper was published in
- `doi`: [Digital Object Identifier](https://www.doi.org)
- `report-no`: Report Number
- `abstract`: The abstract of the paper
- `categories`: Categories / tags in the ArXiv system
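Each record in the metadata dump is a standalone JSON object with the fields above, with `categories` stored as a single space-separated string. A minimal sketch of parsing one record and splitting the category tags (the record below is abridged from the example above):

```python
import json

# One metadata record, abridged from the example instance above
line = ('{"id": "0704.0002", "submitter": "Louis Theran", '
        '"title": "Sparsity-certifying Graph Decompositions", '
        '"categories": "math.CO cs.CG", "update_date": "2008-12-13"}')

record = json.loads(line)
# The categories field is a space-separated string of arXiv tags
categories = record["categories"].split()
print(record["id"], categories)  # → 0704.0002 ['math.CO', 'cs.CG']
```

The full dump on Kaggle is a file of newline-delimited JSON objects, so the same parsing can be applied line by line without loading the whole file into memory.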
### Data Splits
The data has not been split; only a single `train` split is provided.
## Dataset Creation
### Curation Rationale
For nearly 30 years, arXiv has served the public and research communities by providing open access to scholarly articles, from the vast branches of physics to the many subdisciplines of computer science to everything in between, including math, statistics, electrical engineering, quantitative biology, and economics. This rich corpus of information offers significant, but sometimes overwhelming, depth. In these times of unique global challenges, efficient extraction of insights from data is essential. To help make arXiv more accessible, a free, open pipeline on Kaggle provides the machine-readable arXiv dataset: a repository of 1.7 million articles with relevant features such as article titles, authors, categories, abstracts, full-text PDFs, and more. It is intended to empower new use cases and the exploration of richer machine learning techniques that combine multi-modal features toward applications like trend analysis, paper recommender engines, category prediction, co-citation networks, knowledge graph construction, and semantic search interfaces.
### Source Data
This data is based on arXiv papers.
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
This dataset contains no annotations.
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
The original data is maintained by [ArXiv](https://arxiv.org/)
### Licensing Information
The data is under the [Creative Commons CC0 1.0 Universal Public Domain Dedication](https://creativecommons.org/publicdomain/zero/1.0/)
### Citation Information
```
@misc{clement2019arxiv,
title={On the Use of ArXiv as a Dataset},
author={Colin B. Clement and Matthew Bierbaum and Kevin P. O'Keeffe and Alexander A. Alemi},
year={2019},
eprint={1905.00075},
archivePrefix={arXiv},
primaryClass={cs.IR}
}
```
### Contributions
Thanks to [@tanmoyio](https://github.com/tanmoyio) for adding this dataset. |
CyberHarem/ise_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ise/伊勢/伊势 (Azur Lane)
This is the dataset of ise/伊勢/伊势 (Azur Lane), containing 11 images and their tags.
The core tags of this character are `animal_ears, breasts, fox_ears, red_hair, fox_tail, tail, hair_ornament, ponytail, bangs, large_breasts, long_hair, medium_breasts, red_eyes, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 14.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ise_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 8.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ise_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 23 | 15.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ise_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 12.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ise_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 23 | 24.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ise_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ise_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, solo, midriff, navel, smile, cleavage, fingerless_gloves, hakama_skirt, simple_background, collarbone, black_gloves, full_body, standing, sword, white_background, detached_sleeves, hip_vent, holding_weapon |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | midriff | navel | smile | cleavage | fingerless_gloves | hakama_skirt | simple_background | collarbone | black_gloves | full_body | standing | sword | white_background | detached_sleeves | hip_vent | holding_weapon |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:----------|:--------|:--------|:-----------|:--------------------|:---------------|:--------------------|:-------------|:---------------|:------------|:-----------|:--------|:-------------------|:-------------------|:-----------|:-----------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Luciya/llama-2-nuv-intent-big-multi | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 862786
num_examples: 1563
download_size: 132778
dataset_size: 862786
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama-2-nuv-intent-big-multi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
djmix/transitions | ---
dataset_info:
features:
- name: tran_id
dtype: string
- name: mix_id
dtype: string
- name: i_tran
dtype: int32
- name: i_track_prev
dtype: int32
- name: i_track_next
dtype: int32
- name: track_id_prev
dtype: string
- name: track_id_next
dtype: string
- name: match_rate_prev
dtype: float32
- name: match_rate_next
dtype: float32
- name: matched_beats_prev
dtype: int32
- name: matched_beats_next
dtype: int32
- name: overlap_wpts
dtype: int32
- name: overlap_beats
dtype: float32
- name: tran_wpts
dtype: int32
- name: extra_wpts_prev
dtype: int32
- name: extra_wpts_next
dtype: int32
- name: extra_beats_prev
dtype: float32
- name: extra_beats_next
dtype: float32
- name: last_wpt_prev
dtype: int32
- name: last_wpt_next
dtype: int32
- name: total_wpt_prev
dtype: int32
- name: total_wpt_next
dtype: int32
- name: matched_time_mix_prev
dtype: float32
- name: matched_time_mix_next
dtype: float32
- name: matched_time_track_prev
dtype: float32
- name: matched_time_track_next
dtype: float32
- name: timestamp_prev
dtype: float32
- name: timestamp_next
dtype: float32
- name: case_name_prev
dtype: string
- name: case_name_next
dtype: string
- name: feature_prev
dtype: string
- name: feature_next
dtype: string
- name: metric_prev
dtype: string
- name: metric_next
dtype: string
- name: key_change_prev
dtype: int32
- name: key_change_next
dtype: int32
- name: mix_cue_in_beat_prev
dtype: int32
- name: mix_cue_in_beat_next
dtype: int32
- name: mix_cue_out_beat_prev
dtype: int32
- name: mix_cue_out_beat_next
dtype: int32
- name: track_cue_in_beat_prev
dtype: int32
- name: track_cue_in_beat_next
dtype: int32
- name: track_cue_out_beat_prev
dtype: int32
- name: track_cue_out_beat_next
dtype: int32
- name: mix_cue_in_time_prev
dtype: float32
- name: mix_cue_in_time_next
dtype: float32
- name: mix_cue_out_time_prev
dtype: float32
- name: mix_cue_out_time_next
dtype: float32
- name: track_cue_in_time_prev
dtype: float32
- name: track_cue_in_time_next
dtype: float32
- name: track_cue_out_time_prev
dtype: float32
- name: track_cue_out_time_next
dtype: float32
- name: cost_prev
dtype: float32
- name: cost_next
dtype: float32
- name: wp_prev
sequence:
sequence: int32
- name: wp_next
sequence:
sequence: int32
- name: wp_raw_prev
sequence:
sequence: int32
- name: wp_raw_next
sequence:
sequence: int32
splits:
- name: train
num_bytes: 3980668452
num_examples: 64748
download_size: 1355715395
dataset_size: 3980668452
---
# Dataset Card for "transitions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ricahrd/Pedrinho | ---
license: openrail
---
|
prakharrishi11j/newsCategorization | ---
license: llama2
---
|
punwaiw/DiffusionJockey | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 13316574561.25
num_examples: 17110
download_size: 13312875795
dataset_size: 13316574561.25
---
# Dataset Card for "DiffusionJockey"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Multimodal-Fatima/OxfordPets_test_facebook_opt_350m_Visclues_ns_3669 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_0_bs_16
num_bytes: 121460915.375
num_examples: 3669
- name: fewshot_1_bs_16
num_bytes: 122822636.375
num_examples: 3669
- name: fewshot_3_bs_16
num_bytes: 125537076.375
num_examples: 3669
- name: fewshot_5_bs_16
num_bytes: 128243735.375
num_examples: 3669
- name: fewshot_8_bs_16
num_bytes: 132312128.375
num_examples: 3669
download_size: 604694442
dataset_size: 630376491.875
---
# Dataset Card for "OxfordPets_test_facebook_opt_350m_Visclues_ns_3669"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Zicara/Hands_11k | ---
license: unknown
---
|