| datasetId | card |
|---|---|
bot-yaya/undl_ar2en_aligned | ---
dataset_info:
  features:
  - name: record
    dtype: string
  - name: clean_para_index_set_pair
    dtype: string
  - name: src
    dtype: string
  - name: dst
    dtype: string
  - name: src_text
    dtype: string
  - name: dst_text
    dtype: string
  - name: src_rate
    dtype: float64
  - name: dst_rate
    dtype: float64
  splits:
  - name: train
    num_bytes: 12012712129
    num_examples: 15217906
  download_size: 0
  dataset_size: 12012712129
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
# Dataset Card for "undl_ar2en_aligned"
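Given the schema above (parallel `src_text`/`dst_text` paragraphs with `src_rate`/`dst_rate` alignment scores), a minimal sketch of rate-based filtering might look like this; the 0.9 threshold and the sample rows are illustrative assumptions, not values taken from this card:

```python
# Minimal sketch: filter aligned ar->en paragraph pairs by their alignment-rate
# scores. The 0.9 threshold and the sample rows below are assumptions for
# illustration only.

def keep_pair(row, min_rate=0.9):
    """Keep a pair only when both sides meet the alignment-rate threshold."""
    return row["src_rate"] >= min_rate and row["dst_rate"] >= min_rate

rows = [
    {"src_text": "...", "dst_text": "...", "src_rate": 0.97, "dst_rate": 0.95},
    {"src_text": "...", "dst_text": "...", "src_rate": 0.42, "dst_rate": 0.88},
]
filtered = [r for r in rows if keep_pair(r)]
print(len(filtered))  # 1
```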
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/martha_santa_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of martha_santa/マルタ〔サンタ〕/玛尔达〔圣诞〕 (Fate/Grand Order)
This is the dataset of martha_santa/マルタ〔サンタ〕/玛尔达〔圣诞〕 (Fate/Grand Order), containing 64 images and their tags.
The core tags of this character are `purple_hair, long_hair, blue_eyes, hat, red_headwear, santa_hat, breasts`; these core tags are pruned from this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 64 | 61.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/martha_santa_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 64 | 59.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/martha_santa_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 143 | 109.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/martha_santa_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/martha_santa_fgo',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 64 |  |  |  |  |  | 1girl, long_sleeves, smile, solo, brown_shirt, blush, christmas, looking_at_viewer, mittens, white_apron, red_skirt, fur_trim, brooch, open_mouth, off_shoulder, belt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_sleeves | smile | solo | brown_shirt | blush | christmas | looking_at_viewer | mittens | white_apron | red_skirt | fur_trim | brooch | open_mouth | off_shoulder | belt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------|:-------|:--------------|:--------|:------------|:--------------------|:----------|:--------------|:------------|:-----------|:---------|:-------------|:---------------|:-------|
| 0 | 64 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
ZHLiu627/ultrafeedback_binarized_with_response_full_part1 | ---
dataset_info:
  features:
  - name: prompt
    dtype: string
  - name: prompt_id
    dtype: string
  - name: chosen
    list:
    - name: content
      dtype: string
    - name: role
      dtype: string
  - name: rejected
    list:
    - name: content
      dtype: string
    - name: role
      dtype: string
  - name: messages
    list:
    - name: content
      dtype: string
    - name: role
      dtype: string
  - name: score_chosen
    dtype: float64
  - name: score_rejected
    dtype: float64
  - name: reference_response
    dtype: string
  splits:
  - name: train_prefs
    num_bytes: 167825271
    num_examples: 20000
  download_size: 93223431
  dataset_size: 167825271
configs:
- config_name: default
  data_files:
  - split: train_prefs
    path: data/train_prefs-*
---
# Dataset Card for "ultrafeedback_binarized_with_response_full_part1"
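Per the schema above, each of `chosen`, `rejected`, and `messages` is a list of `{"content", "role"}` turns. A minimal sketch of pulling the final assistant reply out of such a conversation follows; the sample record is an illustrative assumption, not a row from this dataset:

```python
# Minimal sketch of working with the conversation-list schema: each turn is a
# dict with "content" and "role" keys. The sample record below is invented for
# illustration.

def last_assistant_turn(conversation):
    """Return the content of the final assistant message, or None if absent."""
    for turn in reversed(conversation):
        if turn["role"] == "assistant":
            return turn["content"]
    return None

record = {
    "prompt": "What is 2 + 2?",
    "chosen": [
        {"role": "user", "content": "What is 2 + 2?"},
        {"role": "assistant", "content": "2 + 2 = 4."},
    ],
    "score_chosen": 9.0,
    "score_rejected": 3.0,
}
print(last_assistant_turn(record["chosen"]))  # 2 + 2 = 4.
```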
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sekarmulyani/ulasan-ecommerce-classification | ---
license: apache-2.0
task_categories:
- text-classification
language:
- id
size_categories:
- 100K<n<1M
--- |
johannes-garstenauer/PerformanceTest | ---
license: apache-2.0
---
|
rajistics/electricity_demand | ---
task_categories:
- time-series-forecasting
---
The Victoria electricity demand dataset from the [MAPIE GitHub repository](https://github.com/scikit-learn-contrib/MAPIE/tree/master/examples/data).
It consists of the hourly electricity demand (in GW)
of the state of Victoria, Australia, together with the temperature
(in degrees Celsius).
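For hourly demand data like this, a seasonal-naive baseline (predict each hour with the value observed 24 hours earlier) is a common forecasting starting point. The sketch below uses a synthetic periodic series standing in for the real demand column:

```python
# Minimal sketch of a seasonal-naive baseline for hourly data: forecast
# y[t] = y[t - 24]. The synthetic series below is an assumption standing in
# for the dataset's demand column.

def seasonal_naive(series, season=24):
    """Forecast y[t] = y[t - season] for all t >= season."""
    return [series[t - season] for t in range(season, len(series))]

demand = [4.0 + (t % 24) * 0.1 for t in range(72)]  # three synthetic days
forecast = seasonal_naive(demand)
# On a perfectly 24-periodic series the baseline is exact:
print(max(abs(f - y) for f, y in zip(forecast, demand[24:])))  # 0.0
```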
|
Multimodal-Fatima/Caltech101_not_background_test_facebook_opt_125m_Attributes_ns_5647 | ---
dataset_info:
  features:
  - name: id
    dtype: int64
  - name: image
    dtype: image
  - name: prompt
    dtype: string
  - name: true_label
    dtype: string
  - name: prediction
    dtype: string
  - name: scores
    sequence: float64
  splits:
  - name: fewshot_0_bs_16
    num_bytes: 84088557.125
    num_examples: 5647
  - name: fewshot_1_bs_16
    num_bytes: 85276022.125
    num_examples: 5647
  - name: fewshot_3_bs_16
    num_bytes: 87656291.125
    num_examples: 5647
  - name: fewshot_5_bs_16
    num_bytes: 90034037.125
    num_examples: 5647
  - name: fewshot_8_bs_16
    num_bytes: 93580093.125
    num_examples: 5647
  download_size: 415553691
  dataset_size: 440635000.625
---
# Dataset Card for "Caltech101_not_background_test_facebook_opt_125m_Attributes_ns_5647"
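The split names above appear to encode the few-shot count and batch size, e.g. `fewshot_5_bs_16` for 5 shots at batch size 16. A small parser making that naming convention explicit might look as follows (the convention itself is an assumption inferred from the split list):

```python
# Minimal sketch: parse split names like "fewshot_5_bs_16" into their
# few-shot count and batch size. The naming convention is inferred from the
# split list above, not documented by the card.
import re

def parse_split(name):
    m = re.fullmatch(r"fewshot_(\d+)_bs_(\d+)", name)
    if m is None:
        raise ValueError(f"unrecognized split name: {name}")
    return {"shots": int(m.group(1)), "batch_size": int(m.group(2))}

print(parse_split("fewshot_5_bs_16"))  # {'shots': 5, 'batch_size': 16}
```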
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_jingyeom__SOLAR_KO_1.3_deup | ---
pretty_name: Evaluation run of jingyeom/SOLAR_KO_1.3_deup
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jingyeom/SOLAR_KO_1.3_deup](https://huggingface.co/jingyeom/SOLAR_KO_1.3_deup)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jingyeom__SOLAR_KO_1.3_deup\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-17T00:23:55.496430](https://huggingface.co/datasets/open-llm-leaderboard/details_jingyeom__SOLAR_KO_1.3_deup/blob/main/results_2024-01-17T00-23-55.496430.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5568308436610663,\n\
\ \"acc_stderr\": 0.03382759863491837,\n \"acc_norm\": 0.562882955720715,\n\
\ \"acc_norm_stderr\": 0.03456146092708182,\n \"mc1\": 0.3182374541003672,\n\
\ \"mc1_stderr\": 0.016305988648920623,\n \"mc2\": 0.4754562707057089,\n\
\ \"mc2_stderr\": 0.01501819768286651\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5238907849829352,\n \"acc_stderr\": 0.014594701798071654,\n\
\ \"acc_norm\": 0.5597269624573379,\n \"acc_norm_stderr\": 0.014506769524804234\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5974905397331209,\n\
\ \"acc_stderr\": 0.004894012555642646,\n \"acc_norm\": 0.7997410874327823,\n\
\ \"acc_norm_stderr\": 0.003993761698847879\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.04026097083296564,\n\
\ \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.04026097083296564\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5924528301886792,\n \"acc_stderr\": 0.030242233800854494,\n\
\ \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.030242233800854494\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.039420826399272135,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.039420826399272135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5606936416184971,\n\
\ \"acc_stderr\": 0.03784271932887468,\n \"acc_norm\": 0.5606936416184971,\n\
\ \"acc_norm_stderr\": 0.03784271932887468\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\
\ \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n\
\ \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36243386243386244,\n \"acc_stderr\": 0.02475747390275206,\n \"\
acc_norm\": 0.36243386243386244,\n \"acc_norm_stderr\": 0.02475747390275206\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.043902592653775614,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.043902592653775614\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.632258064516129,\n \"acc_stderr\": 0.02743086657997347,\n \"acc_norm\"\
: 0.632258064516129,\n \"acc_norm_stderr\": 0.02743086657997347\n },\n\
\ \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3793103448275862,\n\
\ \"acc_stderr\": 0.03413963805906235,\n \"acc_norm\": 0.3793103448275862,\n\
\ \"acc_norm_stderr\": 0.03413963805906235\n },\n \"harness|hendrycksTest-high_school_computer_science|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"\
acc\": 0.6727272727272727,\n \"acc_stderr\": 0.036639749943912434,\n \
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.036639749943912434\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7070707070707071,\n \"acc_stderr\": 0.03242497958178816,\n \"\
acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.03242497958178816\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7772020725388601,\n \"acc_stderr\": 0.030031147977641538,\n\
\ \"acc_norm\": 0.7772020725388601,\n \"acc_norm_stderr\": 0.030031147977641538\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5051282051282051,\n \"acc_stderr\": 0.025349672906838653,\n\
\ \"acc_norm\": 0.5051282051282051,\n \"acc_norm_stderr\": 0.025349672906838653\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253252,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253252\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.03218358107742613,\n \
\ \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.03218358107742613\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7504587155963303,\n \"acc_stderr\": 0.018553897629501624,\n \"\
acc_norm\": 0.7504587155963303,\n \"acc_norm_stderr\": 0.018553897629501624\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.03395322726375797,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.03395322726375797\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7401960784313726,\n \"acc_stderr\": 0.03077855467869327,\n \"\
acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.03077855467869327\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6962025316455697,\n \"acc_stderr\": 0.02993669638713861,\n \
\ \"acc_norm\": 0.6962025316455697,\n \"acc_norm_stderr\": 0.02993669638713861\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870255,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870255\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884122,\n \"\
acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884122\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.04668408033024931,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.04668408033024931\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6503067484662577,\n \"acc_stderr\": 0.03746668325470021,\n\
\ \"acc_norm\": 0.6503067484662577,\n \"acc_norm_stderr\": 0.03746668325470021\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.024414947304543678,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.024414947304543678\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7611749680715197,\n\
\ \"acc_stderr\": 0.015246803197398687,\n \"acc_norm\": 0.7611749680715197,\n\
\ \"acc_norm_stderr\": 0.015246803197398687\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5867052023121387,\n \"acc_stderr\": 0.026511261369409247,\n\
\ \"acc_norm\": 0.5867052023121387,\n \"acc_norm_stderr\": 0.026511261369409247\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2759776536312849,\n\
\ \"acc_stderr\": 0.014950103002475365,\n \"acc_norm\": 0.2759776536312849,\n\
\ \"acc_norm_stderr\": 0.014950103002475365\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.027684181883302877,\n\
\ \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.027684181883302877\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6655948553054662,\n\
\ \"acc_stderr\": 0.02679542232789393,\n \"acc_norm\": 0.6655948553054662,\n\
\ \"acc_norm_stderr\": 0.02679542232789393\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.02646248777700187,\n\
\ \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.02646248777700187\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3829787234042553,\n \"acc_stderr\": 0.02899908090480618,\n \
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.02899908090480618\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4211212516297262,\n\
\ \"acc_stderr\": 0.012610325733489905,\n \"acc_norm\": 0.4211212516297262,\n\
\ \"acc_norm_stderr\": 0.012610325733489905\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5330882352941176,\n \"acc_stderr\": 0.030306257722468317,\n\
\ \"acc_norm\": 0.5330882352941176,\n \"acc_norm_stderr\": 0.030306257722468317\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5441176470588235,\n \"acc_stderr\": 0.020148939420415745,\n \
\ \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.020148939420415745\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726492,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726492\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\
\ \"acc_stderr\": 0.03134328358208955,\n \"acc_norm\": 0.7313432835820896,\n\
\ \"acc_norm_stderr\": 0.03134328358208955\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7426900584795322,\n \"acc_stderr\": 0.03352799844161865,\n\
\ \"acc_norm\": 0.7426900584795322,\n \"acc_norm_stderr\": 0.03352799844161865\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3182374541003672,\n\
\ \"mc1_stderr\": 0.016305988648920623,\n \"mc2\": 0.4754562707057089,\n\
\ \"mc2_stderr\": 0.01501819768286651\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7687450670876085,\n \"acc_stderr\": 0.01185004012485051\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2259287338893101,\n \
\ \"acc_stderr\": 0.01151909877727995\n }\n}\n```"
repo_url: https://huggingface.co/jingyeom/SOLAR_KO_1.3_deup
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
  data_files:
  - split: 2024_01_17T00_23_55.496430
    path:
    - '**/details_harness|arc:challenge|25_2024-01-17T00-23-55.496430.parquet'
  - split: latest
    path:
    - '**/details_harness|arc:challenge|25_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_gsm8k_5
  data_files:
  - split: 2024_01_17T00_23_55.496430
    path:
    - '**/details_harness|gsm8k|5_2024-01-17T00-23-55.496430.parquet'
  - split: latest
    path:
    - '**/details_harness|gsm8k|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hellaswag_10
  data_files:
  - split: 2024_01_17T00_23_55.496430
    path:
    - '**/details_harness|hellaswag|10_2024-01-17T00-23-55.496430.parquet'
  - split: latest
    path:
    - '**/details_harness|hellaswag|10_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_5
  data_files:
  - split: 2024_01_17T00_23_55.496430
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-international_law|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-management|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-marketing|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-sociology|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-virology|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T00-23-55.496430.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-international_law|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-management|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-marketing|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-sociology|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-virology|5_2024-01-17T00-23-55.496430.parquet'
    - '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
  data_files:
  - split: 2024_01_17T00_23_55.496430
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T00-23-55.496430.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_anatomy_5
  data_files:
  - split: 2024_01_17T00_23_55.496430
    path:
    - '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T00-23-55.496430.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_astronomy_5
  data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-17T00-23-55.496430.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- '**/details_harness|winogrande|5_2024-01-17T00-23-55.496430.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-17T00-23-55.496430.parquet'
- config_name: results
data_files:
- split: 2024_01_17T00_23_55.496430
path:
- results_2024-01-17T00-23-55.496430.parquet
- split: latest
path:
- results_2024-01-17T00-23-55.496430.parquet
---
# Dataset Card for Evaluation run of jingyeom/SOLAR_KO_1.3_deup
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jingyeom/SOLAR_KO_1.3_deup](https://huggingface.co/jingyeom/SOLAR_KO_1.3_deup) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
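Config names appear to be derived from the raw harness task ids by replacing the `|`, `:` and `-` separators with underscores and appending the few-shot count. A minimal sketch of that mapping (an illustration inferred from the names above, not part of the official tooling):

```python
def task_to_config_name(task: str) -> str:
    """Turn a raw harness task id such as 'harness|hendrycksTest-anatomy|5'
    into the config name used by this dataset ('harness_hendrycksTest_anatomy_5')."""
    return task.replace("|", "_").replace(":", "_").replace("-", "_")

print(task_to_config_name("harness|hendrycksTest-anatomy|5"))  # harness_hendrycksTest_anatomy_5
print(task_to_config_name("harness|truthfulqa:mc|0"))          # harness_truthfulqa_mc_0
```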
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jingyeom__SOLAR_KO_1.3_deup",
	"harness_winogrande_5",
	split="latest")
```
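To load a specific run instead of the latest one, pass the timestamped split name. Judging by the config list above, that name is simply the run timestamp with `-` and `:` replaced by underscores; a small sketch of the conversion (an illustration, assuming this naming convention holds):

```python
def timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp such as '2024-01-17T00:23:55.496430' into the
    split name used in this dataset ('2024_01_17T00_23_55.496430')."""
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-01-17T00:23:55.496430"))  # 2024_01_17T00_23_55.496430
```

The resulting string can then be passed as `split=` to `load_dataset` in place of `"latest"`.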
## Latest results
These are the [latest results from run 2024-01-17T00:23:55.496430](https://huggingface.co/datasets/open-llm-leaderboard/details_jingyeom__SOLAR_KO_1.3_deup/blob/main/results_2024-01-17T00-23-55.496430.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each eval has its own results and "latest" split):
```python
{
"all": {
"acc": 0.5568308436610663,
"acc_stderr": 0.03382759863491837,
"acc_norm": 0.562882955720715,
"acc_norm_stderr": 0.03456146092708182,
"mc1": 0.3182374541003672,
"mc1_stderr": 0.016305988648920623,
"mc2": 0.4754562707057089,
"mc2_stderr": 0.01501819768286651
},
"harness|arc:challenge|25": {
"acc": 0.5238907849829352,
"acc_stderr": 0.014594701798071654,
"acc_norm": 0.5597269624573379,
"acc_norm_stderr": 0.014506769524804234
},
"harness|hellaswag|10": {
"acc": 0.5974905397331209,
"acc_stderr": 0.004894012555642646,
"acc_norm": 0.7997410874327823,
"acc_norm_stderr": 0.003993761698847879
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5723684210526315,
"acc_stderr": 0.04026097083296564,
"acc_norm": 0.5723684210526315,
"acc_norm_stderr": 0.04026097083296564
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5924528301886792,
"acc_stderr": 0.030242233800854494,
"acc_norm": 0.5924528301886792,
"acc_norm_stderr": 0.030242233800854494
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.039420826399272135,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.039420826399272135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.03784271932887468,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.03784271932887468
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36243386243386244,
"acc_stderr": 0.02475747390275206,
"acc_norm": 0.36243386243386244,
"acc_norm_stderr": 0.02475747390275206
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.043902592653775614,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.043902592653775614
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.632258064516129,
"acc_stderr": 0.02743086657997347,
"acc_norm": 0.632258064516129,
"acc_norm_stderr": 0.02743086657997347
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3793103448275862,
"acc_stderr": 0.03413963805906235,
"acc_norm": 0.3793103448275862,
"acc_norm_stderr": 0.03413963805906235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.036639749943912434,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.036639749943912434
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7070707070707071,
"acc_stderr": 0.03242497958178816,
"acc_norm": 0.7070707070707071,
"acc_norm_stderr": 0.03242497958178816
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7772020725388601,
"acc_stderr": 0.030031147977641538,
"acc_norm": 0.7772020725388601,
"acc_norm_stderr": 0.030031147977641538
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5051282051282051,
"acc_stderr": 0.025349672906838653,
"acc_norm": 0.5051282051282051,
"acc_norm_stderr": 0.025349672906838653
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253252,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253252
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5672268907563025,
"acc_stderr": 0.03218358107742613,
"acc_norm": 0.5672268907563025,
"acc_norm_stderr": 0.03218358107742613
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.037101857261199946,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.037101857261199946
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7504587155963303,
"acc_stderr": 0.018553897629501624,
"acc_norm": 0.7504587155963303,
"acc_norm_stderr": 0.018553897629501624
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.03395322726375797,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.03395322726375797
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.03077855467869327,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.03077855467869327
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6962025316455697,
"acc_stderr": 0.02993669638713861,
"acc_norm": 0.6962025316455697,
"acc_norm_stderr": 0.02993669638713861
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870255,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870255
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.04205953933884122,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.04205953933884122
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04668408033024931,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04668408033024931
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6503067484662577,
"acc_stderr": 0.03746668325470021,
"acc_norm": 0.6503067484662577,
"acc_norm_stderr": 0.03746668325470021
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543678,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543678
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7611749680715197,
"acc_stderr": 0.015246803197398687,
"acc_norm": 0.7611749680715197,
"acc_norm_stderr": 0.015246803197398687
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5867052023121387,
"acc_stderr": 0.026511261369409247,
"acc_norm": 0.5867052023121387,
"acc_norm_stderr": 0.026511261369409247
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2759776536312849,
"acc_stderr": 0.014950103002475365,
"acc_norm": 0.2759776536312849,
"acc_norm_stderr": 0.014950103002475365
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.027684181883302877,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.027684181883302877
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6655948553054662,
"acc_stderr": 0.02679542232789393,
"acc_norm": 0.6655948553054662,
"acc_norm_stderr": 0.02679542232789393
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.02646248777700187,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.02646248777700187
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.02899908090480618,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.02899908090480618
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4211212516297262,
"acc_stderr": 0.012610325733489905,
"acc_norm": 0.4211212516297262,
"acc_norm_stderr": 0.012610325733489905
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5330882352941176,
"acc_stderr": 0.030306257722468317,
"acc_norm": 0.5330882352941176,
"acc_norm_stderr": 0.030306257722468317
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.020148939420415745,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.020148939420415745
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661896,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726492,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726492
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.03134328358208955,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.03134328358208955
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7426900584795322,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.7426900584795322,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3182374541003672,
"mc1_stderr": 0.016305988648920623,
"mc2": 0.4754562707057089,
"mc2_stderr": 0.01501819768286651
},
"harness|winogrande|5": {
"acc": 0.7687450670876085,
"acc_stderr": 0.01185004012485051
},
"harness|gsm8k|5": {
"acc": 0.2259287338893101,
"acc_stderr": 0.01151909877727995
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
DJBanzin/Vozesvukvuk | ---
license: openrail
---
|
dar-tau/lm-extraction-benchmark | ---
dataset_info:
features:
- name: preprefix
sequence: uint16
- name: prefix
sequence: uint16
- name: suffix
sequence: uint16
splits:
- name: train
num_bytes: 6180000
num_examples: 15000
download_size: 5792506
dataset_size: 6180000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
erikliu18/us-congress-hearing | ---
task_categories:
- text-classification
language:
- en
tags:
- finance
- legal
---
# U.S. Congressional Hearings Dataset
This dataset currently contains cleaned sentences from all House Committee on Energy and Commerce hearings from 2002.
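The cleaning procedure itself is not documented in this card. Purely as a hypothetical illustration of sentence-level transcript cleanup (the page-marker pattern, splitting rule, and length threshold below are assumptions, not the dataset's actual pipeline):

```python
import re

def clean_transcript(text: str) -> list[str]:
    # Drop page-break artifacts such as "[Page 12]" (illustrative pattern).
    text = re.sub(r"\[\s*Page\s+\d+\s*\]", " ", text)
    # Collapse runs of whitespace left over from line wrapping.
    text = re.sub(r"\s+", " ", text).strip()
    # Naive sentence split on terminal punctuation.
    sentences = re.split(r"(?<=[.!?])\s+", text)
    # Keep only sentences long enough to be useful for classification.
    return [s for s in sentences if len(s.split()) > 3]

raw = "The hearing will come to order. [Page 12] Thank you. We meet today to examine energy policy."
print(clean_transcript(raw))
# ['The hearing will come to order.', 'We meet today to examine energy policy.']
```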
A total of 1K+ hearing transcripts in txt format from govinfo.gov were collected and cleaned. |
feliipert/Reportes-radiologicos | ---
license: apache-2.0
task_categories:
- text-classification
language:
- es
tags:
- medical
pretty_name: Radiologist
size_categories:
- n<1K
--- |
mnazari/nena_speech_1_0_test | ---
pretty_name: NENA Speech Dataset 1.0 (test)
annotations_creators:
- crowdsourced
- Geoffrey Khan
language_creators:
- crowdsourced
language:
- aii
- cld
- huy
- lsd
- trg
- aij
- bhn
- hrt
- kqd
- syn
license:
- cc0-1.0
multilinguality:
- multilingual
task_categories:
- automatic-speech-recognition
- text-to-speech
- translation
size_categories:
- 10K<n<100K
- 1K<n<10K
- n<1K
---
# Dataset Card for NENA Speech Dataset 1.0 (test)
## Table of Contents
- [Dataset Summary](#dataset-summary)
- [Dataset Description](#dataset-description)
- [Languages](#languages)
- [How to Use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
<!-- - [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations) -->
- [Building the Dataset](#building-the-dataset)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
<!-- - [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations) -->
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## ⚠️ This is a temporary repository that will be replaced by the end of 2023
## Dataset Summary
NENA Speech is a multimodal dataset to help teach machines how real people speak the Northeastern Neo-Aramaic (NENA) dialects.
The NENA dialects form a very diverse group of Aramaic dialects spoken by Christian and Jewish communities indigenous to northwestern Iran, northern Iraq, and southeastern Türkiye.
NENA Speech consists of multimodal examples of speech in the NENA dialects. While all documented NENA dialects are included, not all have data yet, and some never will, owing to the recent loss of their final speakers.
## Dataset Description
- **Homepage**: https://crowdsource.nenadb.dev/
- **Point of Contact:** [Matthew Nazari](mailto:matthewnazari@college.harvard.edu)
## Languages
The NENA dialects form a very diverse group of Aramaic dialects spoken by Christian and Jewish communities indigenous to northwestern Iran, northern Iraq, and southeastern Türkiye.
Speakers of the Christian dialects call their language Assyrian and Chaldean in English. In their language these speakers use multiple different terms (e.g. suráy, sureth, ḥadiṯan, senaya). Speakers of the Jewish dialects call their language lišana deni, lišanət noshan, lišana nosha, lišana didan, all meaning "our language". Some names reflect the consciousness of it being a specifically Jewish language (e.g. lišan hozaye, hulaula).
NENA Speech has a subset for all of the over 150 NENA dialects. Not all dialects have examples available yet. Some dialects will never have examples available due to the loss of their final speakers in recent years.
## How to Use
The `datasets` library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared in one call to your local drive by using the `load_dataset` function.
For example, simply specify the corresponding language config name (e.g., "urmi (christian)" for the dialect of the Assyrian Christians of Urmi):
```python
from datasets import load_dataset
nena_speech = load_dataset("mnazari/nena_speech_1_0_test", "urmi (christian)", split="train")
```
To find out more about loading and preparing audio datasets, head over to [hf.co/blog/audio-datasets](https://huggingface.co/blog/audio-datasets).
## Dataset Structure
### Data Instances
The NENA Speech dataset is a multimodal dataset that consists of three different kinds of examples:
1. **Unlabeled speech examples:** these contain audio of speech (`audio`) but no accompanying transcription (`transcription`) or translation (`translation`). This is useful for representation learning.
2. **Transcribed speech examples:** these contain both audio and transcription of speech. These are useful for machine learning tasks like automatic speech recognition and speech synthesis.
3. **Transcribed and translated speech examples:** these contain audio, transcription, and translation of speech. These are useful for tasks like multimodal translation.
Make sure to filter for the kinds of examples you need for your task before using it.
```json
{
"transcription": "gu-mdìta.ˈ",
"translation": "in the town.",
"audio": {
"path": "et/train/nena_speech_0uk14ofpom196aj.mp3",
"array": array([-0.00048828, -0.00018311, -0.00137329, ..., 0.00079346, 0.00091553, 0.00085449], dtype=float32),
"sampling_rate": 48000
},
"locale": "IRN",
"proficiency": "proficient as mom",
"age": "70's",
"crowdsourced": true,
"unlabeled": true,
"interrupted": true,
"client_id": "gwurt1g1ln",
"path": "et/train/nena_speech_0uk14ofpom196aj.mp3"
}
```
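A minimal helper for telling these three kinds apart, using the field names from the instance above (`example_kind` is an illustrative function, not part of any dataset API):

```python
def example_kind(example: dict) -> str:
    # Unlabeled: audio only, no transcription.
    if not example.get("transcription"):
        return "unlabeled"
    # Transcribed: audio plus transcription, no translation.
    if not example.get("translation"):
        return "transcribed"
    # Transcribed and translated: all three modalities present.
    return "transcribed and translated"

print(example_kind({"transcription": "gu-mdìta.ˈ", "translation": "in the town."}))
# transcribed and translated
```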
### Data Fields
- `transcription (string)`: The transcription of what was spoken (e.g. `"beta"`)
- `translation (string)`: The translation of what was spoken in English (e.g. `"house"`)
- `audio (dict)`: A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the "audio" column, i.e. `dataset[0]["audio"]` should always be preferred over `dataset["audio"][0]`.
- `locale (string)`: The locale of the speaker
- `proficiency (string)`: The proficiency of the speaker
- `age (string)`: The age of the speaker (e.g. `"20's"`, `"50's"`, `"100+"`)
- `crowdsourced (bool)`: Indicates whether the example was crowdsourced as opposed to collected from existing language documentation resources
- `interrupted (bool)`: Indicates whether the example was interrupted with the speaker making sound effects or switching into another language
- `client_id (string)`: An id for which client (voice) made the recording
- `path (string)`: The path to the audio file
### Data Splits
The examples have been subdivided into three portions:
1. **dev:** the validation split (10%)
2. **test:** the test split (10%)
3. **train:** the train split (80%)
All three splits contain only data that has been reviewed and deemed of high quality.
## Dataset Creation
<!-- ### Curation Rationale
[Needs More Information]
### Source Data
#### Language Documentation Resources
[Needs More Information]
#### Webscraping Facebook
[Needs More Information]
#### Crowdsourcing
[Needs More Information]
### Annotations
[Needs More Information] -->
### Building the Dataset
The NENA Speech dataset itself is built using `build.py`.
First, install the necessary requirements.
```
pip install -r requirements.txt
```
Next, build the dataset.
```
python build.py --build
```
Finally, push to the HuggingFace dataset repository.
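For example (an assumption, not a documented step: this uses the `huggingface-cli upload` command, presumes you are already authenticated via `huggingface-cli login`, and that the build output lives in a local `data/` directory):

```
huggingface-cli upload mnazari/nena_speech_1_0_test ./data --repo-type dataset
```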
## Personal and Sensitive Information
The dataset consists of recordings from people who have donated their voice online. You agree not to attempt to determine the identity of speakers in the NENA Speech dataset.
## Data Preprocessing
The dataset consists of three different kinds of examples (see [Data Instances](#data-instances)).
Make sure to filter for the kinds of examples you need for your task before using it. For example, for automatic speech recognition you will want to filter for examples with transcriptions.
For most tasks, you will want to filter out examples that are interrupted (e.g. by the speaker making sound effects or switching into another language).
```python
from datasets import load_dataset
ds = load_dataset("mnazari/nena_speech_1_0_test", "urmi (christian)", split="train")
def filter_for_asr(example):
return example['transcription'] and not example['interrupted']
ds = ds.filter(filter_for_asr, desc="filter dataset")
```
Transcriptions include markers of linguistic and acoustic features which may be removed in certain tasks (e.g. word stress, nuclear stress, intonation group markers, vowel length).
```python
from datasets import load_dataset
ds = load_dataset("mnazari/nena_speech_1_0_test", "urmi (christian)", split="train")
def prepare_dataset(batch):
chars_to_remove = ['ˈ', '̀', '́', '̄', '̆', '.', ',', '?', '!']
for char in chars_to_remove:
batch["transcription"] = batch["transcription"].replace(char, "")
return batch
ds = ds.map(prepare_dataset, desc="preprocess dataset")
```
<!-- ## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information] -->
## Additional Information
### Licensing Information
Public Domain, [CC-0](https://creativecommons.org/share-your-work/public-domain/cc0/).
### Citation Information
This work has not yet been published.
|
andersonbcdefg/doc_nli_pos_pairs | ---
dataset_info:
features:
- name: query
dtype: string
- name: pos
dtype: string
splits:
- name: train
num_bytes: 888275454
num_examples: 528671
download_size: 467347853
dataset_size: 888275454
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
happycute123/DL_dataset | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 679382539.464
num_examples: 1057
- name: test
num_bytes: 167054773.0
num_examples: 264
download_size: 0
dataset_size: 846437312.464
---
# Dataset Card for "DL_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
xorsuyash/raft_datasetp1 | ---
license: mit
---
|
FDeRubeis/araft | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: label
dtype: string
- name: prediction
dtype: string
- name: trajectory
dtype: string
splits:
- name: train
num_bytes: 961824
num_examples: 413
download_size: 499573
dataset_size: 961824
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_uukuguy__GDC-Tiny-L1-1.8B | ---
pretty_name: Evaluation run of uukuguy/GDC-Tiny-L1-1.8B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/GDC-Tiny-L1-1.8B](https://huggingface.co/uukuguy/GDC-Tiny-L1-1.8B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__GDC-Tiny-L1-1.8B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-07T19:08:55.303427](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__GDC-Tiny-L1-1.8B/blob/main/results_2024-04-07T19-08-55.303427.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.43818164678489546,\n\
\ \"acc_stderr\": 0.03446376592395749,\n \"acc_norm\": 0.44061901287311,\n\
\ \"acc_norm_stderr\": 0.03519031730786029,\n \"mc1\": 0.2594859241126071,\n\
\ \"mc1_stderr\": 0.015345409485557989,\n \"mc2\": 0.40282806404013544,\n\
\ \"mc2_stderr\": 0.014486104033756\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3438566552901024,\n \"acc_stderr\": 0.01388064457015621,\n\
\ \"acc_norm\": 0.3651877133105802,\n \"acc_norm_stderr\": 0.014070265519268804\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4400517825134435,\n\
\ \"acc_stderr\": 0.004953787146510927,\n \"acc_norm\": 0.5866361282613025,\n\
\ \"acc_norm_stderr\": 0.004914305798575695\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720685,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720685\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.362962962962963,\n\
\ \"acc_stderr\": 0.04153948404742399,\n \"acc_norm\": 0.362962962962963,\n\
\ \"acc_norm_stderr\": 0.04153948404742399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.46710526315789475,\n \"acc_stderr\": 0.040601270352363966,\n\
\ \"acc_norm\": 0.46710526315789475,\n \"acc_norm_stderr\": 0.040601270352363966\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4528301886792453,\n \"acc_stderr\": 0.030635627957961816,\n\
\ \"acc_norm\": 0.4528301886792453,\n \"acc_norm_stderr\": 0.030635627957961816\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4305555555555556,\n\
\ \"acc_stderr\": 0.04140685639111503,\n \"acc_norm\": 0.4305555555555556,\n\
\ \"acc_norm_stderr\": 0.04140685639111503\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\"\
: 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3930635838150289,\n\
\ \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.3930635838150289,\n\
\ \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179964,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179964\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537314,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537314\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3544973544973545,\n \"acc_stderr\": 0.024636830602842,\n \"acc_norm\"\
: 0.3544973544973545,\n \"acc_norm_stderr\": 0.024636830602842\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.0393253768039287,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.0393253768039287\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.49032258064516127,\n \"acc_stderr\": 0.028438677998909558,\n \"\
acc_norm\": 0.49032258064516127,\n \"acc_norm_stderr\": 0.028438677998909558\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3103448275862069,\n \"acc_stderr\": 0.03255086769970103,\n \"\
acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.03255086769970103\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.03851716319398395,\n\
\ \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.03851716319398395\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5959595959595959,\n \"acc_stderr\": 0.03496130972056127,\n \"\
acc_norm\": 0.5959595959595959,\n \"acc_norm_stderr\": 0.03496130972056127\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5440414507772021,\n \"acc_stderr\": 0.03594413711272437,\n\
\ \"acc_norm\": 0.5440414507772021,\n \"acc_norm_stderr\": 0.03594413711272437\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.37948717948717947,\n \"acc_stderr\": 0.024603626924097417,\n\
\ \"acc_norm\": 0.37948717948717947,\n \"acc_norm_stderr\": 0.024603626924097417\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228412,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228412\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.0322529423239964,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.0322529423239964\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.035433042343899844,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.035433042343899844\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5596330275229358,\n \"acc_stderr\": 0.02128431062376155,\n \"\
acc_norm\": 0.5596330275229358,\n \"acc_norm_stderr\": 0.02128431062376155\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.25462962962962965,\n \"acc_stderr\": 0.029711275860005354,\n \"\
acc_norm\": 0.25462962962962965,\n \"acc_norm_stderr\": 0.029711275860005354\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4852941176470588,\n \"acc_stderr\": 0.035077938347913236,\n \"\
acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.035077938347913236\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5527426160337553,\n \"acc_stderr\": 0.03236564251614192,\n \
\ \"acc_norm\": 0.5527426160337553,\n \"acc_norm_stderr\": 0.03236564251614192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5291479820627802,\n\
\ \"acc_stderr\": 0.03350073248773404,\n \"acc_norm\": 0.5291479820627802,\n\
\ \"acc_norm_stderr\": 0.03350073248773404\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4961832061068702,\n \"acc_stderr\": 0.04385162325601553,\n\
\ \"acc_norm\": 0.4961832061068702,\n \"acc_norm_stderr\": 0.04385162325601553\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968431,\n \"\
acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968431\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.32515337423312884,\n \"acc_stderr\": 0.036803503712864595,\n\
\ \"acc_norm\": 0.32515337423312884,\n \"acc_norm_stderr\": 0.036803503712864595\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5825242718446602,\n \"acc_stderr\": 0.048828405482122375,\n\
\ \"acc_norm\": 0.5825242718446602,\n \"acc_norm_stderr\": 0.048828405482122375\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7606837606837606,\n\
\ \"acc_stderr\": 0.027951826808924336,\n \"acc_norm\": 0.7606837606837606,\n\
\ \"acc_norm_stderr\": 0.027951826808924336\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5644955300127714,\n\
\ \"acc_stderr\": 0.017730589927926595,\n \"acc_norm\": 0.5644955300127714,\n\
\ \"acc_norm_stderr\": 0.017730589927926595\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.49710982658959535,\n \"acc_stderr\": 0.02691864538323901,\n\
\ \"acc_norm\": 0.49710982658959535,\n \"acc_norm_stderr\": 0.02691864538323901\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23128491620111732,\n\
\ \"acc_stderr\": 0.014102223623152586,\n \"acc_norm\": 0.23128491620111732,\n\
\ \"acc_norm_stderr\": 0.014102223623152586\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.028607893699576063,\n\
\ \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.028607893699576063\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4340836012861736,\n\
\ \"acc_stderr\": 0.0281502322445356,\n \"acc_norm\": 0.4340836012861736,\n\
\ \"acc_norm_stderr\": 0.0281502322445356\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.45987654320987653,\n \"acc_stderr\": 0.027731022753539274,\n\
\ \"acc_norm\": 0.45987654320987653,\n \"acc_norm_stderr\": 0.027731022753539274\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.34397163120567376,\n \"acc_stderr\": 0.028338017428611317,\n \
\ \"acc_norm\": 0.34397163120567376,\n \"acc_norm_stderr\": 0.028338017428611317\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34810951760104303,\n\
\ \"acc_stderr\": 0.012166738993698195,\n \"acc_norm\": 0.34810951760104303,\n\
\ \"acc_norm_stderr\": 0.012166738993698195\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.34191176470588236,\n \"acc_stderr\": 0.028814722422254187,\n\
\ \"acc_norm\": 0.34191176470588236,\n \"acc_norm_stderr\": 0.028814722422254187\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4199346405228758,\n \"acc_stderr\": 0.019966811178256483,\n \
\ \"acc_norm\": 0.4199346405228758,\n \"acc_norm_stderr\": 0.019966811178256483\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.45714285714285713,\n \"acc_stderr\": 0.03189141832421397,\n\
\ \"acc_norm\": 0.45714285714285713,\n \"acc_norm_stderr\": 0.03189141832421397\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5373134328358209,\n\
\ \"acc_stderr\": 0.03525675167467974,\n \"acc_norm\": 0.5373134328358209,\n\
\ \"acc_norm_stderr\": 0.03525675167467974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n\
\ \"acc_stderr\": 0.03799857454479636,\n \"acc_norm\": 0.39156626506024095,\n\
\ \"acc_norm_stderr\": 0.03799857454479636\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.52046783625731,\n \"acc_stderr\": 0.0383161053282193,\n\
\ \"acc_norm\": 0.52046783625731,\n \"acc_norm_stderr\": 0.0383161053282193\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2594859241126071,\n\
\ \"mc1_stderr\": 0.015345409485557989,\n \"mc2\": 0.40282806404013544,\n\
\ \"mc2_stderr\": 0.014486104033756\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6101026045777427,\n \"acc_stderr\": 0.013707547317008465\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.29037149355572406,\n \
\ \"acc_stderr\": 0.01250359248181895\n }\n}\n```"
repo_url: https://huggingface.co/uukuguy/GDC-Tiny-L1-1.8B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|arc:challenge|25_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|arc:challenge|25_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|arc:challenge|25_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|arc:challenge|25_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|gsm8k|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|gsm8k|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|gsm8k|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|gsm8k|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hellaswag|10_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hellaswag|10_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hellaswag|10_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hellaswag|10_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T00-21-15.731330.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T15-47-54.200639.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T05-02-03.235353.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T19-08-55.303427.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T19-08-55.303427.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- '**/details_harness|winogrande|5_2024-04-06T00-21-15.731330.parquet'
- split: 2024_04_06T15_47_54.200639
path:
- '**/details_harness|winogrande|5_2024-04-06T15-47-54.200639.parquet'
- split: 2024_04_07T05_02_03.235353
path:
- '**/details_harness|winogrande|5_2024-04-07T05-02-03.235353.parquet'
- split: 2024_04_07T19_08_55.303427
path:
- '**/details_harness|winogrande|5_2024-04-07T19-08-55.303427.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-07T19-08-55.303427.parquet'
- config_name: results
data_files:
- split: 2024_04_06T00_21_15.731330
path:
- results_2024-04-06T00-21-15.731330.parquet
- split: 2024_04_06T15_47_54.200639
path:
- results_2024-04-06T15-47-54.200639.parquet
- split: 2024_04_07T05_02_03.235353
path:
- results_2024-04-07T05-02-03.235353.parquet
- split: 2024_04_07T19_08_55.303427
path:
- results_2024-04-07T19-08-55.303427.parquet
- split: latest
path:
- results_2024-04-07T19-08-55.303427.parquet
---
# Dataset Card for Evaluation run of uukuguy/GDC-Tiny-L1-1.8B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [uukuguy/GDC-Tiny-L1-1.8B](https://huggingface.co/uukuguy/GDC-Tiny-L1-1.8B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
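Since the run splits are named after timestamps, the most recent run can also be picked programmatically. A minimal sketch (split names copied from the config list above; the `_`-for-`-`/`:` encoding is an assumption inferred from those names):

```python
from datetime import datetime

# Run splits encode the timestamp with "_" in place of "-" and ":".
splits = [
    "2024_04_06T00_21_15.731330",
    "2024_04_07T05_02_03.235353",
    "2024_04_07T19_08_55.303427",
]

def split_to_datetime(name: str) -> datetime:
    # "2024_04_06T00_21_15.731330" -> datetime(2024, 4, 6, 0, 21, 15, 731330)
    date_part, time_part = name.split("T")
    return datetime.fromisoformat(
        date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    )

latest = max(splits, key=split_to_datetime)
print(latest)  # 2024_04_07T19_08_55.303427
```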
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__GDC-Tiny-L1-1.8B",
"harness_winogrande_5",
split="train")
```
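The config names map mechanically onto the harness metric keys that appear in the results JSON below. A small helper illustrating the mapping (this naming convention is an assumption inferred from the config list above, not an official API):

```python
def metric_key_to_config(key: str) -> str:
    # e.g. "harness|hendrycksTest-world_religions|5"
    #   -> "harness_hendrycksTest_world_religions_5"
    return key.replace("|", "_").replace("-", "_").replace(":", "_")

print(metric_key_to_config("harness|truthfulqa:mc|0"))
# harness_truthfulqa_mc_0
```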
## Latest results
These are the [latest results from run 2024-04-07T19:08:55.303427](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__GDC-Tiny-L1-1.8B/blob/main/results_2024-04-07T19-08-55.303427.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.43818164678489546,
"acc_stderr": 0.03446376592395749,
"acc_norm": 0.44061901287311,
"acc_norm_stderr": 0.03519031730786029,
"mc1": 0.2594859241126071,
"mc1_stderr": 0.015345409485557989,
"mc2": 0.40282806404013544,
"mc2_stderr": 0.014486104033756
},
"harness|arc:challenge|25": {
"acc": 0.3438566552901024,
"acc_stderr": 0.01388064457015621,
"acc_norm": 0.3651877133105802,
"acc_norm_stderr": 0.014070265519268804
},
"harness|hellaswag|10": {
"acc": 0.4400517825134435,
"acc_stderr": 0.004953787146510927,
"acc_norm": 0.5866361282613025,
"acc_norm_stderr": 0.004914305798575695
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.04153948404742399,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.04153948404742399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.46710526315789475,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.46710526315789475,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4528301886792453,
"acc_stderr": 0.030635627957961816,
"acc_norm": 0.4528301886792453,
"acc_norm_stderr": 0.030635627957961816
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.04140685639111503,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.04140685639111503
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3930635838150289,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.3930635838150289,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179964,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179964
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537314,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537314
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3544973544973545,
"acc_stderr": 0.024636830602842,
"acc_norm": 0.3544973544973545,
"acc_norm_stderr": 0.024636830602842
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.0393253768039287,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.0393253768039287
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.49032258064516127,
"acc_stderr": 0.028438677998909558,
"acc_norm": 0.49032258064516127,
"acc_norm_stderr": 0.028438677998909558
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.03255086769970103,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.03255086769970103
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.03851716319398395,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.03851716319398395
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5959595959595959,
"acc_stderr": 0.03496130972056127,
"acc_norm": 0.5959595959595959,
"acc_norm_stderr": 0.03496130972056127
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5440414507772021,
"acc_stderr": 0.03594413711272437,
"acc_norm": 0.5440414507772021,
"acc_norm_stderr": 0.03594413711272437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.37948717948717947,
"acc_stderr": 0.024603626924097417,
"acc_norm": 0.37948717948717947,
"acc_norm_stderr": 0.024603626924097417
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228412,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228412
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.035433042343899844,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.035433042343899844
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5596330275229358,
"acc_stderr": 0.02128431062376155,
"acc_norm": 0.5596330275229358,
"acc_norm_stderr": 0.02128431062376155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25462962962962965,
"acc_stderr": 0.029711275860005354,
"acc_norm": 0.25462962962962965,
"acc_norm_stderr": 0.029711275860005354
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.035077938347913236,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.035077938347913236
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5527426160337553,
"acc_stderr": 0.03236564251614192,
"acc_norm": 0.5527426160337553,
"acc_norm_stderr": 0.03236564251614192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5291479820627802,
"acc_stderr": 0.03350073248773404,
"acc_norm": 0.5291479820627802,
"acc_norm_stderr": 0.03350073248773404
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4961832061068702,
"acc_stderr": 0.04385162325601553,
"acc_norm": 0.4961832061068702,
"acc_norm_stderr": 0.04385162325601553
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968431,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968431
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.32515337423312884,
"acc_stderr": 0.036803503712864595,
"acc_norm": 0.32515337423312884,
"acc_norm_stderr": 0.036803503712864595
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.5825242718446602,
"acc_stderr": 0.048828405482122375,
"acc_norm": 0.5825242718446602,
"acc_norm_stderr": 0.048828405482122375
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7606837606837606,
"acc_stderr": 0.027951826808924336,
"acc_norm": 0.7606837606837606,
"acc_norm_stderr": 0.027951826808924336
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5644955300127714,
"acc_stderr": 0.017730589927926595,
"acc_norm": 0.5644955300127714,
"acc_norm_stderr": 0.017730589927926595
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.49710982658959535,
"acc_stderr": 0.02691864538323901,
"acc_norm": 0.49710982658959535,
"acc_norm_stderr": 0.02691864538323901
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23128491620111732,
"acc_stderr": 0.014102223623152586,
"acc_norm": 0.23128491620111732,
"acc_norm_stderr": 0.014102223623152586
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.028607893699576063,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.028607893699576063
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4340836012861736,
"acc_stderr": 0.0281502322445356,
"acc_norm": 0.4340836012861736,
"acc_norm_stderr": 0.0281502322445356
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.45987654320987653,
"acc_stderr": 0.027731022753539274,
"acc_norm": 0.45987654320987653,
"acc_norm_stderr": 0.027731022753539274
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.34397163120567376,
"acc_stderr": 0.028338017428611317,
"acc_norm": 0.34397163120567376,
"acc_norm_stderr": 0.028338017428611317
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34810951760104303,
"acc_stderr": 0.012166738993698195,
"acc_norm": 0.34810951760104303,
"acc_norm_stderr": 0.012166738993698195
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.34191176470588236,
"acc_stderr": 0.028814722422254187,
"acc_norm": 0.34191176470588236,
"acc_norm_stderr": 0.028814722422254187
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4199346405228758,
"acc_stderr": 0.019966811178256483,
"acc_norm": 0.4199346405228758,
"acc_norm_stderr": 0.019966811178256483
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.45714285714285713,
"acc_stderr": 0.03189141832421397,
"acc_norm": 0.45714285714285713,
"acc_norm_stderr": 0.03189141832421397
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5373134328358209,
"acc_stderr": 0.03525675167467974,
"acc_norm": 0.5373134328358209,
"acc_norm_stderr": 0.03525675167467974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39156626506024095,
"acc_stderr": 0.03799857454479636,
"acc_norm": 0.39156626506024095,
"acc_norm_stderr": 0.03799857454479636
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.52046783625731,
"acc_stderr": 0.0383161053282193,
"acc_norm": 0.52046783625731,
"acc_norm_stderr": 0.0383161053282193
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2594859241126071,
"mc1_stderr": 0.015345409485557989,
"mc2": 0.40282806404013544,
"mc2_stderr": 0.014486104033756
},
"harness|winogrande|5": {
"acc": 0.6101026045777427,
"acc_stderr": 0.013707547317008465
},
"harness|gsm8k|5": {
"acc": 0.29037149355572406,
"acc_stderr": 0.01250359248181895
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_rte_completive_done | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 481976
num_examples: 1156
- name: train
num_bytes: 421696
num_examples: 962
download_size: 578766
dataset_size: 903672
---
# Dataset Card for "MULTI_VALUE_rte_completive_done"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
toilaluan/t2i_reward_v3 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: model_type
dtype: string
- name: request_id
dtype: int64
- name: topic
dtype: string
- name: reward
dtype: float64
- name: individual_rewards
struct:
- name: image_rewarder
dtype: float64
- name: hps_v2_rewarder
dtype: float64
splits:
- name: train
num_bytes: 205400
num_examples: 2400
download_size: 49480
dataset_size: 205400
---
# Dataset Card for "t2i_reward_v3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Slichi/universe | ---
license: openrail
---
|
IndonesiaAI/dpo-dataset | ---
dataset_info:
features:
- name: qid
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 14256438583.53102
num_examples: 3798835
- name: test
num_bytes: 1584049565.4689815
num_examples: 422093
download_size: 8864292488
dataset_size: 15840488149.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
DynamicSuperb/StressDetection_MIRSD | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: label
dtype: string
- name: word
dtype: string
- name: instruction
dtype: string
splits:
- name: test
num_bytes: 19241423.221727517
num_examples: 200
download_size: 17768718
dataset_size: 19241423.221727517
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "stress_dection_MIR_SD"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/tsubaki_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of tsubaki/春日ツバキ/椿 (Blue Archive)
This is the dataset of tsubaki/春日ツバキ/椿 (Blue Archive), containing 500 images and their tags.
The core tags of this character are `black_hair, short_hair, animal_ears, breasts, large_breasts, hair_between_eyes, red_halo, halo, black_eyes, tassel, raccoon_ears`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 952.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tsubaki_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 784.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tsubaki_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1379 | 1.60 GiB | [Download](https://huggingface.co/datasets/CyberHarem/tsubaki_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/tsubaki_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, blush, elbow_gloves, red_gloves, rope, solo, looking_at_viewer, red_sailor_collar, red_skirt, simple_background, white_background, armpits, arms_up, open_mouth, underboob, arms_behind_head, breast_curtain, sweat, sideboob, sideless_outfit |
| 1 | 12 |  |  |  |  |  | 1girl, elbow_gloves, red_gloves, red_sailor_collar, red_skirt, revealing_clothes, sideboob, simple_background, solo, thighs, white_background, blush, looking_at_viewer, pleated_skirt, bare_shoulders, sideless_outfit, breast_curtain, underboob, shimenawa, two-tone_shirt |
| 2 | 14 |  |  |  |  |  | 1boy, 1girl, blush, hetero, penis, pussy, red_sailor_collar, vaginal, open_mouth, solo_focus, elbow_gloves, mosaic_censoring, red_gloves, spread_legs, huge_breasts, nipples, sweat, red_skirt, breast_curtain, clothed_sex, on_back, rope_belt, cowgirl_position, cum, girl_on_top, looking_at_viewer, on_bed, thighs, two-tone_shirt |
| 3 | 6 |  |  |  |  |  | 1girl, bare_shoulders, cleavage, navel, open_mouth, stomach, blush, solo, alternate_costume, cowboy_shot, looking_at_viewer, wet, white_bikini, choker, collarbone, halterneck, red_flower, sarong, simple_background, smile, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | elbow_gloves | red_gloves | rope | solo | looking_at_viewer | red_sailor_collar | red_skirt | simple_background | white_background | armpits | arms_up | open_mouth | underboob | arms_behind_head | breast_curtain | sweat | sideboob | sideless_outfit | revealing_clothes | thighs | pleated_skirt | bare_shoulders | shimenawa | two-tone_shirt | 1boy | hetero | penis | pussy | vaginal | solo_focus | mosaic_censoring | spread_legs | huge_breasts | nipples | clothed_sex | on_back | rope_belt | cowgirl_position | cum | girl_on_top | on_bed | cleavage | navel | stomach | alternate_costume | cowboy_shot | wet | white_bikini | choker | collarbone | halterneck | red_flower | sarong | smile |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:---------------|:-------------|:-------|:-------|:--------------------|:--------------------|:------------|:--------------------|:-------------------|:----------|:----------|:-------------|:------------|:-------------------|:-----------------|:--------|:-----------|:------------------|:--------------------|:---------|:----------------|:-----------------|:------------|:-----------------|:-------|:---------|:--------|:--------|:----------|:-------------|:-------------------|:--------------|:---------------|:----------|:--------------|:----------|:------------|:-------------------|:------|:--------------|:---------|:-----------|:--------|:----------|:--------------------|:--------------|:------|:---------------|:---------|:-------------|:-------------|:-------------|:---------|:--------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | X | X | X | | X | X | X | X | X | X | | | | X | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 14 |  |  |  |  |  | X | X | X | X | | | X | X | X | | | | | X | | | X | X | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | | | | X | X | | | X | X | | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Dogge/bluemoon-fandom-1-1-rp-cleaned-korean-tranlated | ---
license: wtfpl
---
|
satendra4u2022/dpo | ---
license: mit
---
|
tashkeela | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- ar
license:
- gpl-2.0
multilinguality:
- monolingual
size_categories:
- n<1K
source_datasets:
- original
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
paperswithcode_id: null
pretty_name: Tashkeela
tags:
- diacritics-prediction
dataset_info:
features:
- name: text
dtype: string
- name: book
dtype: string
config_name: plain_text
splits:
- name: train
num_bytes: 1081110249
num_examples: 97
download_size: 183393530
dataset_size: 1081110249
---
# Dataset Card for Tashkeela
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Tashkeela](https://sourceforge.net/projects/tashkeela/)
- **Repository:** [Tashkeela](https://sourceforge.net/projects/tashkeela/)
- **Paper:** [Tashkeela: Novel corpus of Arabic vocalized texts, data for auto-diacritization systems](https://www.sciencedirect.com/science/article/pii/S2352340917300112)
- **Point of Contact:** [Taha Zerrouki](mailto:t_zerrouki@esi.dz)
### Dataset Summary
It contains about 75 million fully vocalized words, drawn mainly from 97 books of classical and modern Arabic.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The dataset is in Arabic.
## Dataset Structure
### Data Instances
```
{'book': 'zip://Tashkeela-arabic-diacritized-text-utf8-0.3/texts.txt/msa/al-kalema.org/أشكال-التجارب-في-مَثَل-الزارع.htm.txt::https://sourceforge.net/projects/tashkeela/files/latest/download',
'text': 'الكلمة\n\n\nصفحه اصلی\nاشترك\nالكتاب المقدس\nجميع المقالات\nالترتيب بالموضوع\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nهذا المقال على نسخة PDF\n\n\nأشكال التجارب في مَثَل الزارع\n\n\tقد رأينا في مقال " \nوسائل واشكال التجارب" الأشكال التي من الممكن أن تتخذها التجارب (وخاصة الاختبارات التي تأتي من خلال الآلام والاضطهاد وأشراك إطاعة شهوات الإنسان العتيق، الجسد)، نستطيع أيضاً أن نرى هذه الأقسام عاملة في مثال الزارع. هناك مجموعتين في مثال الزارع أنه برغم من سماعهم واستقبالهم للكلمة، إلا أنهم لم يجلبوا ثماراً. والسؤال هو لماذا؟\n\n1. التجارب في القسم الثاني من مثال الزارع\n\nفيما يخص القسم الثاني من مثال الزارع، تخبرنا عنها متى 13: 20- 21 ولوقا 8: 13 \nمتى 13: 20- 21\n" وَالْمَزْرُوعُ عَلَى الأَمَاكِنِ الْمُحْجِرَةِ هُوَ الَّذِي يَسْمَعُ الْكَلِمَةَ، وَحَالاً يَقْبَلُهَا بِفَرَحٍ، وَلكِنْ لَيْسَ لَهُ أَصْلٌ فِي ذَاتِهِ، بَلْ هُوَ إِلَى حِينٍ. فَإِذَا حَدَثَ ضِيقٌ أَوِ اضْطِهَادٌ مِنْ أَجْلِ الْكَلِمَةِ فَحَالاً يَعْثُرُ."\nلوقا 8: 13\n" وَالَّذِينَ عَلَى الصَّخْرِ هُمُ الَّذِينَ مَتَى سَمِعُوا يَقْبَلُونَ الْكَلِمَةَ بِفَرَحٍ، وَهؤُلاَءِ لَيْسَ لَهُمْ أَصْلٌ، فَيُؤْمِنُونَ إِلَى حِينٍ، وَفِي وَقْتِ التَّجْرِبَةِ يَرْتَدُّونَ."\n\nكما نرى، الناس في هذا القسم سمعوا الكلمة وحالاً قبلوها بفرح! بمعنى آخر، لقد كانوا متحمسين جداً تجاه الكلمة. ثم جاءت التجارب والاختبارات في شكل ضيق واضطهاد من أجل الكلمة، أي أنه بسبب الكلمة، اضطهد هؤلاء الناس. وعندئذ توقفوا. عوضاً عن أن يحفظوا ويتمسكوا بالكلمة التي قد حدث واستقبلوها بفرح، تراجعوا وسقطوا بعيداً، إن كنت مؤمناً صغيراً مليء بالحماسة تجاه الله، وبالرغم من أنه قد يبدو أنه لا يوجد شيطان من حولك، فهذا لن يستمر إلى الأبد. فالتجارب والاختبارات آتية. ستحتاج إلى أن تحفظ وتتمسك بالإيمان وبالكلمة التي قد حدث واستقبلتها بفرح. كما تقول لنا الكلمة:\nعبرانيين 10: 35- 39\n" فَلاَ تَطْرَحُوا ثِقَتَكُمُ الَّتِي لَهَا مُجَازَاةٌ عَظِيمَةٌ. لأَنَّكُمْ تَحْتَاجُونَ إِلَى الصَّبْرِ، حَتَّى إِذَا صَنَعْتُمْ مَشِيئَةَ اللهِ تَنَالُونَ الْمَوْعِدَ. لأَنَّهُ بَعْدَ قَلِيل جِدًّا «سَيَأْتِي الآتِي وَلاَ يُبْطِئُ. 
أَمَّا الْبَارُّ فَبِالإِيمَانِ يَحْيَا، وَإِنِ ارْتَدَّ لاَ تُسَرُّ بِهِ نَفْسِي». وَأَمَّا نَحْنُ فَلَسْنَا مِنَ الارْتِدَادِ لِلْهَلاَكِ، بَلْ مِنَ الإِيمَانِ لاقْتِنَاءِ النَّفْسِ."\n\nوالضيق قد يأخذ أشكالاً عديدة. رأيت أناساً يسقطون، تاركين الإيمان لأن آبائهم أو أقاربهم وأصدقائهم قد عارضوهم ورفضوهم بسبب إيمانهم. بالطبع قد يأخذ الاضطهاد أشكالاً أكثر من ذلك أيضاً، مثل أن تلقى في سجن أو أن تعذب لأجل إيمانك. قد يسبب الموت كذلك، كما حدث مع اسطفانوس ويعقوب أخو يوحنا. وتقول الكلمة من أجلك ومن أجل كل الذين حوكموا:\nرومية 16: 19- 20\n" لأَنَّ طَاعَتَكُمْ ذَاعَتْ إِلَى الْجَمِيعِ، فَأَفْرَحُ أَنَا بِكُمْ، وَأُرِيدُ أَنْ تَكُونُوا حُكَمَاءَ لِلْخَيْرِ وَبُسَطَاءَ لِلشَّرِّ. وَإِلهُ السَّلاَمِ سَيَسْحَقُ الشَّيْطَانَ تَحْتَ أَرْجُلِكُمْ سَرِيعًا."\nو بطرس الأولى 5: 8- 10\n" اُصْحُوا وَاسْهَرُوا. لأَنَّ إِبْلِيسَ خَصْمَكُمْ كَأَسَدٍ زَائِرٍ، يَجُولُ مُلْتَمِسًا مَنْ يَبْتَلِعُهُ هُوَ. فَقَاوِمُوهُ، رَاسِخِينَ فِي الإِيمَانِ، عَالِمِينَ أَنَّ نَفْسَ هذِهِ الآلاَمِ تُجْرَى عَلَى إِخْوَتِكُمُ الَّذِينَ فِي الْعَالَمِ. وَإِلهُ كُلِّ نِعْمَةٍ الَّذِي دَعَانَا إِلَى مَجْدِهِ الأَبَدِيِّ فِي الْمَسِيحِ يَسُوعَ، بَعْدَمَا تَأَلَّمْتُمْ يَسِيرًا، هُوَ يُكَمِّلُكُمْ، وَيُثَبِّتُكُمْ، وَيُقَوِّيكُمْ، وَيُمَكِّنُكُمْ."\n\nتمسك بالإيمان حتى النهاية. ضع حياتك ووضعك بين يدي الله وكن مستعداً لمواجهة أي شيء قد يحدث، أجل وحتى السخرية والعذاب. الله معك، سيقويك وسيعينك تماماً مثلما فعل مع يسوع في بستان جسثيماني. وتماماً مثلما فعل مع بولس في السجن عندما اضطهد من قِبَل اليهود (أعمال الرسل 23: 11). وكما قال بولس في كورنثوس الثانية 1: 7:" عَالِمِينَ أَنَّكُمْ كَمَا أَنْتُمْ شُرَكَاءُ فِي الآلاَمِ، كَذلِكَ فِي التَّعْزِيَةِ أَيْضًا." فالعزاء الآتي من الله يوازن أي سخرية أو أي عذاب قد يأتي إلينا من أي إنسان.\n\n2. 
التجارب في القسم الثالث من مثال الزارع\n\nبخصوص القسم الثالث من مثال الزارع، فنقرأ عنه في مرقس 4: 18- 19\n\n" وَهؤُلاَءِ هُمُ الَّذِينَ زُرِعُوا بَيْنَ الشَّوْكِ: هؤُلاَءِ هُمُ الَّذِينَ يَسْمَعُونَ الْكَلِمَةَ، وَهُمُومُ هذَا الْعَالَمِ وَغُرُورُ الْغِنَى وَشَهَوَاتُ سَائِرِ الأَشْيَاءِ تَدْخُلُ وَتَخْنُقُ الْكَلِمَةَ فَتَصِيرُ بِلاَ ثَمَرٍ."\nو لوقا 8: 14\n" وَالَّذِي سَقَطَ بَيْنَ الشَّوْكِ هُمُ الَّذِينَ يَسْمَعُونَ، ثُمَّ يَذْهَبُونَ فَيَخْتَنِقُونَ مِنْ هُمُومِ الْحَيَاةِ وَغِنَاهَا وَلَذَّاتِهَا، وَلاَ يُنْضِجُونَ ثَمَرًا."\n\nهؤلاء قد سمعوا الكلمة وفهموها ولكنهم صاروا بلا ثمر، وما هو السبب؟ السبب هو لأنهم تركوا أبواب قلوبهم مفتوحة لأشواك " وَهُمُومُ هذَا الْعَالَمِ وَغُرُورُ الْغِنَى وَشَهَوَاتُ سَائِرِ الأَشْيَاءِ" (مرقس 4: 19)، والتي تدخل فتخنق الكلمة، كما رأينا يعقوب دائماً ما يقول:\nيعقوب 1: 13- 15\n" لاَ يَقُلْ أَحَدٌ إِذَا جُرِّبَ: «إِنِّي أُجَرَّبُ مِنْ قِبَلِ اللهِ»، لأَنَّ اللهَ غَيْرُ مُجَرَّبٍ بِالشُّرُورِ، وَهُوَ لاَ يُجَرِّبُ أَحَدًا. وَلكِنَّ كُلَّ وَاحِدٍ يُجَرَّبُ إِذَا انْجَذَبَ وَانْخَدَعَ مِنْ شَهْوَتِهِ. ثُمَّ الشَّهْوَةُ إِذَا حَبِلَتْ تَلِدُ خَطِيَّةً، وَالْخَطِيَّةُ إِذَا كَمَلَتْ تُنْتِجُ مَوْتًا."\nوتيموثاوس الأولى 6: 9 تقول لنا\n" وَأَمَّا الَّذِينَ يُرِيدُونَ أَنْ يَكُونُوا أَغْنِيَاءَ، فَيَسْقُطُونَ فِي تَجْرِبَةٍ وَفَخٍّ وَشَهَوَاتٍ كَثِيرَةٍ غَبِيَّةٍ وَمُضِرَّةٍ، تُغَرِّقُ النَّاسَ فِي الْعَطَبِ وَالْهَلاَكِ."\n\nيجب أن نلاحظ شيئاً هنا: أن تأثير هموم الحياة هو نفس التأثير الذي لتجارب الغنى وشهوات الأشياء الأخرى. فهموم الحياة أيضاً لا تجلب الثمار، إذاً فإن اردت أن تكون مسيحياً مثمراً، أي مسيحي حقيقي وليس فقط مسيحي اسمي، فيجب عليك أن تزيل أشواك الهموم والغنى وملذات الحياة وأن تمنعهم من العودة مرة أخرى. تحتاج إلى أن تفعل شيئاً، تحتاج إلى أن تتغير والله سيعينك في هذا إن كنت حقاً تريده. التجارب في القسم الثالث من مثال الزارع لا تأتي من خلال الاضطهاد والآلام عن طريق الشيطان. ولكن هنا تأخذ التجارب صوراً أكثر مكراً والتي مع هذا تتطلب مقاومتنا. 
الاهتمام بما يهتم به هذا العالم ("هموم هذا العالم")، الرغبة في الغنى أو اشتهاء الأشياء الأخرى هي أمور خطيرة جداً. إنها أشواك يجب إزالتها. كما رأينا بولس يقول:\nرومية 13: 14\n" بَلِ الْبَسُوا الرَّبَّ يَسُوعَ الْمَسِيحَ، وَلاَ تَصْنَعُوا تَدْبِيرًا لِلْجَسَدِ لأَجْلِ الشَّهَوَاتِ."\n\n" لاَ تَصْنَعُوا تَدْبِيرًا لِلْجَسَدِ" والتي تعني أنه يجب علينا أن لا نهتم بالجسد وشهواته. ولكن عوضاً عن ذلك ينبغي لنا أن نطعم أنفسنا بلبن الكلمة الصافي الذي ننمو بواستطه (بطرس الأولى 2: 2).\n\n\nتاسوس كيولاشوجلو'}
```
### Data Fields
- `book` (str): Book filename.
- `text` (str): Text of the book.
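Since the corpus targets auto-diacritization systems, a common preprocessing step is to strip the tashkeel marks from `text` to obtain the undiacritized input, keeping the original as the target. A minimal sketch (the diacritic code-point range used here is an assumption covering the standard harakat, U+064B–U+0652):

```python
import re

# Arabic diacritic (tashkeel) code points: fathatan .. sukun (U+064B-U+0652)
DIACRITICS = re.compile(r"[\u064B-\u0652]")

def strip_tashkeel(text: str) -> str:
    """Remove Arabic diacritics, producing the undiacritized input text."""
    return DIACRITICS.sub("", text)

vocalized = "الْكَلِمَةُ"
print(strip_tashkeel(vocalized))  # → الكلمة
```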
### Data Splits
The dataset is not split.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
The Modern Standard Arabic texts were crawled from the Internet.
#### Who are the source language producers?
Websites.
### Annotations
The dataset does not contain any additional annotations.
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[GNU General Public License, version 2 (GPLv2)](https://opensource.org/licenses/GPL-2.0).
### Citation Information
The dataset was published in this [paper](https://www.sciencedirect.com/science/article/pii/S2352340917300112#!):
```
@article{zerrouki2017tashkeela,
title={Tashkeela: Novel corpus of Arabic vocalized texts, data for auto-diacritization systems},
author={Zerrouki, Taha and Balla, Amar},
journal={Data in brief},
volume={11},
pages={147},
year={2017},
publisher={Elsevier}
}
```
### Contributions
Thanks to [@zaidalyafeai](https://github.com/zaidalyafeai) for adding this dataset. |
CyberHarem/nearl_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of nearl/ニアール/临光 (Arknights)
This is the dataset of nearl/ニアール/临光 (Arknights), containing 353 images and their tags.
The core tags of this character are `animal_ears, blonde_hair, long_hair, horse_ears, animal_ear_fluff, horse_girl, yellow_eyes, tail, ponytail, horse_tail, hair_between_eyes, breasts, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 353 | 649.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nearl_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 353 | 537.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nearl_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 917 | 1.03 GiB | [Download](https://huggingface.co/datasets/CyberHarem/nearl_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/nearl_arknights',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
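For the `IMG+TXT` packages (e.g. `dataset-1200.zip`), each image ships with a same-named `.txt` file holding its comma-separated tags. A minimal sketch for pairing them after extraction (the pairing convention is an assumption based on the package description):

```python
import os

IMAGE_EXTS = {".png", ".jpg", ".jpeg", ".webp"}

def load_image_tag_pairs(dataset_dir: str):
    """Return (image_path, tag_list) for every image/.txt pair in the directory."""
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in IMAGE_EXTS:
            continue
        txt_path = os.path.join(dataset_dir, stem + ".txt")
        if not os.path.exists(txt_path):
            continue  # image without a tag file
        with open(txt_path, encoding="utf-8") as f:
            tags = [t.strip() for t in f.read().split(",") if t.strip()]
        pairs.append((os.path.join(dataset_dir, name), tags))
    return pairs
```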
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, bare_shoulders, cowboy_shot, crop_top, looking_at_viewer, midriff, navel, solo, stomach, black_pants, black_sports_bra, leggings, simple_background, standing, sweat, alternate_costume, blush, thighs, white_background, armpits, arms_behind_head, arms_up, bare_arms, cropped_legs, parted_lips, smile, stretching, wristband |
| 1 | 5 |  |  |  |  |  | 1girl, bare_arms, bare_shoulders, cleavage, looking_at_viewer, midriff, navel, solo, stomach, black_shorts, short_shorts, simple_background, smile, standing, thighs, alternate_costume, black_sports_bra, cowboy_shot, crop_top, hand_up, medium_breasts, abs, bike_shorts, dolphin_shorts, parted_lips, very_long_hair, white_background |
| 2 | 7 |  |  |  |  |  | 1girl, blush, looking_at_viewer, nipples, solo, collarbone, completely_nude, navel, pussy, cowboy_shot, standing, stomach, sidelocks, smile |
| 3 | 5 |  |  |  |  |  | 1girl, blush, horse_penis, huge_penis, looking_at_viewer, solo, anus, ass, simple_background, from_behind, full-package_futanari, huge_breasts, huge_testicles, looking_back, nipples, pussy, uncensored, centaur, completely_nude, cum, erection, grey_background, monster_girl, multiple_legs, parted_lips, torn_clothes |
| 4 | 5 |  |  |  |  |  | 1girl, bare_shoulders, looking_at_viewer, official_alternate_costume, outdoors, solo, swimsuit_cover-up, white_one-piece_swimsuit, cloud, covered_navel, cowboy_shot, off_shoulder, parted_lips, smile, blue_sky, competition_swimsuit, day, groin, medium_breasts, open_jacket, see-through, standing, thigh_strap, ass_visible_through_thighs, bird, blue_jacket, collarbone, eyewear_on_head, long_sleeves, sunglasses |
| 5 | 7 |  |  |  |  |  | 1girl, bare_shoulders, cowboy_shot, solo, thighs, white_one-piece_swimsuit, collarbone, competition_swimsuit, covered_navel, eyewear_on_head, looking_at_viewer, official_alternate_costume, standing, sunglasses, outdoors, water, arm_up, bare_arms, day, smile, blue_sky, blush, cloud, hand_up, swimsuit_cover-up, thigh_strap, wading, wet |
| 6 | 9 |  |  |  |  |  | 1girl, closed_mouth, headset, implied_extra_ears, looking_at_viewer, simple_background, solo, white_background, portrait, upper_body, smile, black_scarf, shoulder_armor |
| 7 | 15 |  |  |  |  |  | 1girl, breastplate, headset, solo, upper_body, pauldrons, scarf, sidelocks, simple_background, closed_mouth, white_background, looking_at_viewer, parted_lips |
| 8 | 6 |  |  |  |  |  | 1girl, headset, solo, weapon, breastplate, holding, looking_at_viewer, orange_eyes, pauldrons, scarf, shield, sidelocks, upper_body, v-shaped_eyebrows, open_mouth |
| 9 | 19 |  |  |  |  |  | 1girl, solo, breastplate, headset, black_gloves, looking_at_viewer, pauldrons, sidelocks, holding_weapon, holding_shield, black_scarf, cowboy_shot, black_dress, closed_mouth, headphones |
| 10 | 14 |  |  |  |  |  | 1girl, breastplate, full_body, pauldrons, solo, black_gloves, headset, looking_at_viewer, standing, black_dress, black_footwear, sidelocks, black_skirt, high_heel_boots, black_scarf, simple_background, holding_weapon, torn_clothes, holding_shield, torn_scarf, closed_mouth, white_background, axe, floating_hair |
| 11 | 5 |  |  |  |  |  | 1girl, headset, looking_at_viewer, shoulder_armor, solo, black_gloves, holding_weapon, implied_extra_ears, upper_body, belt |
| 12 | 7 |  |  |  |  |  | 1girl, headset, implied_extra_ears, looking_at_viewer, solo, white_dress, black_gloves, official_alternate_costume, cowboy_shot, holding_weapon, simple_background, belt, shoulder_armor, black_background, medium_breasts, orange_eyes, polearm |
| 13 | 6 |  |  |  |  |  | 1girl, headset, holding_polearm, implied_extra_ears, solo, armored_boots, looking_at_viewer, shoulder_armor, standing, white_dress, black_gloves, full_body, spear, outdoors, white_coat |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | cowboy_shot | crop_top | looking_at_viewer | midriff | navel | solo | stomach | black_pants | black_sports_bra | leggings | simple_background | standing | sweat | alternate_costume | blush | thighs | white_background | armpits | arms_behind_head | arms_up | bare_arms | cropped_legs | parted_lips | smile | stretching | wristband | cleavage | black_shorts | short_shorts | hand_up | medium_breasts | abs | bike_shorts | dolphin_shorts | very_long_hair | nipples | collarbone | completely_nude | pussy | sidelocks | horse_penis | huge_penis | anus | ass | from_behind | full-package_futanari | huge_breasts | huge_testicles | looking_back | uncensored | centaur | cum | erection | grey_background | monster_girl | multiple_legs | torn_clothes | official_alternate_costume | outdoors | swimsuit_cover-up | white_one-piece_swimsuit | cloud | covered_navel | off_shoulder | blue_sky | competition_swimsuit | day | groin | open_jacket | see-through | thigh_strap | ass_visible_through_thighs | bird | blue_jacket | eyewear_on_head | long_sleeves | sunglasses | water | arm_up | wading | wet | closed_mouth | headset | implied_extra_ears | portrait | upper_body | black_scarf | shoulder_armor | breastplate | pauldrons | scarf | weapon | holding | orange_eyes | shield | v-shaped_eyebrows | open_mouth | black_gloves | holding_weapon | holding_shield | black_dress | headphones | full_body | black_footwear | black_skirt | high_heel_boots | torn_scarf | axe | floating_hair | belt | white_dress | black_background | polearm | holding_polearm | armored_boots | spear | white_coat |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-----------------|:--------------|:-----------|:--------------------|:----------|:--------|:-------|:----------|:--------------|:-------------------|:-----------|:--------------------|:-----------|:--------|:--------------------|:--------|:---------|:-------------------|:----------|:-------------------|:----------|:------------|:---------------|:--------------|:--------|:-------------|:------------|:-----------|:---------------|:---------------|:----------|:-----------------|:------|:--------------|:-----------------|:-----------------|:----------|:-------------|:------------------|:--------|:------------|:--------------|:-------------|:-------|:------|:--------------|:------------------------|:---------------|:-----------------|:---------------|:-------------|:----------|:------|:-----------|:------------------|:---------------|:----------------|:---------------|:-----------------------------|:-----------|:--------------------|:---------------------------|:--------|:----------------|:---------------|:-----------|:-----------------------|:------|:--------|:--------------|:--------------|:--------------|:-----------------------------|:-------|:--------------|:------------------|:---------------|:-------------|:--------|:---------|:---------|:------|:---------------|:----------|:---------------------|:-----------|:-------------|:--------------|:-----------------|:--------------|:------------|:--------|:---------|:----------|:--------------|:---------|:--------------------|:-------------|:---------------|:-----------------|:-----------------|:--------------|:-------------|:------------|:-----------------|:--------------|:------------------|:-------------|:------|:----------------|:-------|:--------------|:-------------------|:----------|:------------------|:----------------|:--------|:-------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | X | | X | X | | X | | X | X | | | | X | | X | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | X | | X | | X | X | X | | | | | X | | | X | | | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | | X | | | X | | | | | X | | | | X | | | | | | | | X | | | | | | | | | | | | | X | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | X | | X | | | X | | | | | | X | | | | | | | | | | | X | X | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 7 |  |  |  |  |  | X | X | X | | X | | | X | | | | | | X | | | X | X | | | | | X | | | X | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | X | X | X | | | | X | | | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 9 |  |  |  |  |  | X | | | | X | | | X | | | | | X | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 15 |  |  |  |  |  | X | | | | X | | | X | | | | | X | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 9 | 19 |  |  |  |  |  | X | | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | X | | X | X | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | |
| 10 | 14 |  |  |  |  |  | X | | | | X | | | X | | | | | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | X | | X | X | | | | | | | | X | X | X | X | | X | X | X | X | X | X | X | | | | | | | | |
| 11 | 5 |  |  |  |  |  | X | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | X | | X | | | | | | | | | | X | X | | | | | | | | | | | X | | | | | | | |
| 12 | 7 |  |  |  |  |  | X | | X | | X | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | X | | | | | | X | | | | X | X | | | | | | | | | | | X | X | X | X | | | | |
| 13 | 6 |  |  |  |  |  | X | | | | X | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | X | | | | | | | | | | X | | | | | X | | | | | | | | X | | | X | X | X | X |
|
simonveitner/samsum_rewritten_0_to_1200 | ---
dataset_info:
features:
- name: orginal_text
dtype: string
- name: rewrite_prompt
dtype: string
- name: rewritten_text
dtype: string
splits:
- name: train
num_bytes: 421524
num_examples: 1200
download_size: 263197
dataset_size: 421524
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ZhangShenao/0.00045_idpo_decalpha_ref_response | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: score_chosen
dtype: float64
- name: score_rejected
dtype: float64
- name: reference_response
dtype: string
splits:
- name: train_prefs_1
num_bytes: 164111773
num_examples: 20378
- name: test_prefs_1
num_bytes: 16019213
num_examples: 2000
download_size: 99390696
dataset_size: 180130986
configs:
- config_name: default
data_files:
- split: train_prefs_1
path: data/train_prefs_1-*
- split: test_prefs_1
path: data/test_prefs_1-*
---
# Dataset Card for "0.00045_idpo_decalpha_ref_response"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fathyshalab/reklambox-balanced-no-stopwords | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: label_name
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 401120
num_examples: 1102
- name: test
num_bytes: 140041
num_examples: 276
download_size: 335528
dataset_size: 541161
---
# Dataset Card for "reklambox-balanced-no-stopwords"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/matsuda_arisa_theidolmstermillionlive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of matsuda_arisa/松田亜利沙 (THE iDOLM@STER: Million Live!)
This is the dataset of matsuda_arisa/松田亜利沙 (THE iDOLM@STER: Million Live!), containing 134 images and their tags.
The core tags of this character are `brown_hair, twintails, long_hair, brown_eyes, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 134 | 132.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matsuda_arisa_theidolmstermillionlive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 134 | 91.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matsuda_arisa_theidolmstermillionlive/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 280 | 175.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matsuda_arisa_theidolmstermillionlive/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 134 | 120.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matsuda_arisa_theidolmstermillionlive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 280 | 221.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/matsuda_arisa_theidolmstermillionlive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/matsuda_arisa_theidolmstermillionlive',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, open_mouth, skirt, solo, :d, blush, boots, hair_bow, jewelry |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | open_mouth | skirt | solo | :d | blush | boots | hair_bow | jewelry |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------------|:--------|:-------|:-----|:--------|:--------|:-----------|:----------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X |
|
mangostin2010/Korean-Wise-Saying | ---
license: unknown
---
|
hirxn/custom_data | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 16378
num_examples: 7
download_size: 15015
dataset_size: 16378
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BangumiBase/mawarupenguindrum | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Mawaru Penguindrum
This is the image base of bangumi Mawaru Penguindrum, we detected 23 characters, 1725 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
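When iterating over the extracted `all.zip`, the noise cluster can be skipped by its directory name. A minimal sketch, assuming the archive extracts into per-character directories named after the `#` column below, with `-1` holding the noise images:

```python
import os

def character_dirs(root: str):
    """List per-character image directories, skipping the '-1' noise cluster."""
    dirs = []
    for name in sorted(os.listdir(root)):
        path = os.path.join(root, name)
        if os.path.isdir(path) and name != "-1":
            dirs.append(path)
    return dirs
```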
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 19 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 177 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 81 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 18 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 76 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 206 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 19 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 14 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 64 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 11 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 313 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 24 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 11 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 306 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 19 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 19 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 13 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 16 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 37 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 17 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 17 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 8 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 240 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
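Each character pack above is distributed as a standalone zip. As a minimal sketch (not part of the original card) of unpacking one such pack and listing its images, assuming the zip has already been downloaded locally (for instance via `huggingface_hub.hf_hub_download` with `repo_type="dataset"` and a filename such as `1/dataset.zip`):

```python
import zipfile
from pathlib import Path


def extract_pack(zip_path: str, out_dir: str) -> list[str]:
    """Extract one character pack and return the sorted image file names.

    A pack may also contain non-image files (e.g. metadata), which are
    extracted but not returned.
    """
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(out)
        return sorted(
            name for name in zf.namelist()
            if name.lower().endswith((".png", ".jpg", ".jpeg", ".webp"))
        )
```

The helper name `extract_pack` and the output directory are hypothetical; only the repo layout (one `dataset.zip` per character folder) comes from the card.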
|
memray/kp20k | ---
license: cc-by-nc-sa-4.0
---
|
crylake/facesyntheticsspigacaptioned_30percent | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: spiga_seg
dtype: image
- name: image_caption
dtype: string
splits:
- name: train
num_bytes: 9080640177.0
num_examples: 30000
download_size: 9066954510
dataset_size: 9080640177.0
---
# Dataset Card for "facesyntheticsspigacaptioned_30percent"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
indiejoseph/wikipedia-zh-filtered | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 258992903
num_examples: 44344
download_size: 164712496
dataset_size: 258992903
---
# Dataset Card for "wikipedia-zh-filtered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rpadilla/ft-capstone2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 36223
num_examples: 12
- name: test
num_bytes: 23728
num_examples: 7
download_size: 57751
dataset_size: 59951
---
# Dataset Card for "ft-capstone2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kanishka/counterfactual-babylm-only_measure_nps_as_singular_removal | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 581821016
num_examples: 11666570
- name: validation
num_bytes: 56120230
num_examples: 1026747
download_size: 421803322
dataset_size: 637941246
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
nakayama/hh-rlhf-helpful-base-ja | ---
license: mit
language:
- ja
---
This dataset was created from https://github.com/anthropics/hh-rlhf by translating the English text recorded in the `chosen` field of helpful-base with FuguMT, then excluding or correcting entries that were not translated well.
|
gsh3729/sw2 | ---
dataset_info:
features:
- name: filename
dtype: string
- name: tif
dtype: binary
- name: tfw
dtype: binary
splits:
- name: train
num_bytes: 22663
num_examples: 2
download_size: 23838
dataset_size: 22663
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kyusque/github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: labels
list:
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: id
dtype: int64
- name: name
dtype: string
- name: node_id
dtype: string
- name: url
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: assignees
list:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: milestone
struct:
- name: closed_at
dtype: string
- name: closed_issues
dtype: int64
- name: created_at
dtype: string
- name: creator
struct:
- name: avatar_url
dtype: string
- name: events_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: gravatar_id
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: login
dtype: string
- name: node_id
dtype: string
- name: organizations_url
dtype: string
- name: received_events_url
dtype: string
- name: repos_url
dtype: string
- name: site_admin
dtype: bool
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: description
dtype: string
- name: due_on
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: labels_url
dtype: string
- name: node_id
dtype: string
- name: number
dtype: int64
- name: open_issues
dtype: int64
- name: state
dtype: string
- name: title
dtype: string
- name: updated_at
dtype: string
- name: url
dtype: string
- name: comments
sequence: string
- name: created_at
dtype: timestamp[ns, tz=UTC]
- name: updated_at
dtype: timestamp[ns, tz=UTC]
- name: closed_at
dtype: timestamp[ns, tz=UTC]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: float64
- name: draft
dtype: float64
- name: pull_request
struct:
- name: diff_url
dtype: string
- name: html_url
dtype: string
- name: merged_at
dtype: string
- name: patch_url
dtype: string
- name: url
dtype: string
- name: body
dtype: string
- name: reactions
struct:
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: confused
dtype: int64
- name: eyes
dtype: int64
- name: heart
dtype: int64
- name: hooray
dtype: int64
- name: laugh
dtype: int64
- name: rocket
dtype: int64
- name: total_count
dtype: int64
- name: url
dtype: string
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: float64
- name: state_reason
dtype: string
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 30854747
num_examples: 6106
download_size: 8722074
dataset_size: 30854747
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "github-issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dinaaaaaa/lima_rand_sel_50_preference_self_reward | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: chosen-rating
dtype: int64
- name: rejected
dtype: string
- name: rejected-rating
dtype: int64
splits:
- name: train
num_bytes: 177447
num_examples: 223
download_size: 44667
dataset_size: 177447
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
automated-research-group/llama2_7b_chat-commonsense_qa-results | ---
dataset_info:
config_name: '{''do_sample''=False, ''beams''=1}'
features:
- name: id
dtype: string
- name: prediction
dtype: string
- name: commonsense_qa_accuracy
dtype: bool
splits:
- name: train
num_bytes: 159243
num_examples: 1221
download_size: 83126
dataset_size: 159243
configs:
- config_name: '{''do_sample''=False, ''beams''=1}'
data_files:
- split: train
path: '{''do_sample''=False, ''beams''=1}/train-*'
---
|
avankumar/new_data_model_methanol_lca_500 | ---
dataset_info:
features:
- name: Train
dtype: string
splits:
- name: train
num_bytes: 378305
num_examples: 502
download_size: 145305
dataset_size: 378305
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ImperialIndians23/nlp_cw_data_processed | ---
dataset_info:
features:
- name: par_id
dtype: string
- name: community
dtype: string
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 1775055
num_examples: 8375
- name: valid
num_bytes: 435628
num_examples: 2094
download_size: 1354140
dataset_size: 2210683
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
---
|
sudeepag/sampled-t0_fsnoopt_data | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: _template_idx
dtype: int64
- name: _task_source
dtype: string
- name: _task_name
dtype: string
- name: _template_type
dtype: string
splits:
- name: train
num_bytes: 7365274691.188863
num_examples: 3219106
download_size: 4167856318
dataset_size: 7365274691.188863
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
NebulaeWis/gelbooru_images | ---
task_categories:
- text-to-image
language:
- en
pretty_name: gelbooru
size_categories:
- 1M<n<10M
---
Images collected from https://gelbooru.com/
ID range: 0~9393795
Encoding: UTF-8
Search tags: "-animated -3d_(artwork) -webm -gif -video -real_life -comic -photo_(medium)"
The shortest edge is capped at 1536 pixels; images are saved as .webp at 90% quality.
The search returned 8364374 images in total, of which 18832 were filtered out.
Images excluded from the dataset:
gif/video files
truncated downloads (still incomplete after 10+ retries)
oversized images (over Pillow's default pixel limit)
In the last 5 columns of the metainfo [artist, character, copyright, metadata, tags], "None" denotes a missing value rather than the literal string "None".
The *.txt files come from the crawler results; they are not captions.
Please build captions from the metainfo and a tagger.
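The caption-building advice above could be sketched as follows. This is a minimal, hypothetical helper, not the uploader's recipe: the column names follow the metainfo description, the comma-joined format is an assumption, and the `metadata` column is deliberately left out since it typically holds quality tags rather than content.

```python
def build_caption(row: dict) -> str:
    """Join selected metainfo tag columns into a comma-separated caption.

    Per the card, "None" in [artist, character, copyright, metadata, tags]
    marks a missing value, so such entries are skipped.
    """
    parts = []
    for col in ("artist", "character", "copyright", "tags"):
        value = row.get(col)
        if value and value != "None":
            # Booru tags use underscores; captions usually use spaces.
            parts.extend(tag.replace("_", " ") for tag in value.split())
    return ", ".join(parts)
```

For example, a row with a missing character column would simply omit that part of the caption.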
Disclaimer
By downloading or using this dataset, you agree to the following terms and conditions:
Purpose of Crawling: The dataset is obtained by crawling a publicly available website. The purpose of this crawling behavior is to upload the dataset to Hugging Face in order to alleviate the load on the original booru site.
Data Accuracy: We make efforts to ensure the accuracy of the dataset, but we cannot guarantee the completeness and accuracy of the data. Users are responsible for evaluating the quality and accuracy of the dataset and bear any consequences arising from inaccurate or incomplete data.
Limitation of Liability: The uploader of this dataset shall not be liable for any losses or damages (including but not limited to any direct, indirect, or incidental damages) arising from the use, misuse, or inability to use the dataset in any way.
Please read and understand the above terms and conditions carefully before using this dataset. If you do not agree to these terms and conditions, you are not allowed to use this dataset. |
breno30/PhPacote | ---
license: openrail
---
|
AdapterOcean/med_alpaca_standardized_cluster_19_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 18706021
num_examples: 34698
download_size: 9286969
dataset_size: 18706021
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_19_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
StevenLe456/viet-tones | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: label
dtype: int64
splits:
- name: train
num_bytes: 177262390.44
num_examples: 1080
download_size: 0
dataset_size: 177262390.44
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "viet-tones"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_seyf1elislam__KuTrix-7b | ---
pretty_name: Evaluation run of seyf1elislam/KuTrix-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [seyf1elislam/KuTrix-7b](https://huggingface.co/seyf1elislam/KuTrix-7b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_seyf1elislam__KuTrix-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-16T06:30:17.941313](https://huggingface.co/datasets/open-llm-leaderboard/details_seyf1elislam__KuTrix-7b/blob/main/results_2024-03-16T06-30-17.941313.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6576587418096124,\n\
\ \"acc_stderr\": 0.03201598935752146,\n \"acc_norm\": 0.6575488737220112,\n\
\ \"acc_norm_stderr\": 0.03267746108371606,\n \"mc1\": 0.5410036719706243,\n\
\ \"mc1_stderr\": 0.017444544447661203,\n \"mc2\": 0.7084705277313671,\n\
\ \"mc2_stderr\": 0.014694482049743158\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6834470989761092,\n \"acc_stderr\": 0.013592431519068077,\n\
\ \"acc_norm\": 0.7047781569965871,\n \"acc_norm_stderr\": 0.013329750293382318\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7027484564827724,\n\
\ \"acc_stderr\": 0.004561141293448457,\n \"acc_norm\": 0.8794064927305317,\n\
\ \"acc_norm_stderr\": 0.0032498873947065044\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569526,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569526\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977934,\n\
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977934\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8532110091743119,\n \"acc_stderr\": 0.015173141845126243,\n \"\
acc_norm\": 0.8532110091743119,\n \"acc_norm_stderr\": 0.015173141845126243\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156861,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156861\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608303,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608303\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.44581005586592176,\n\
\ \"acc_stderr\": 0.016623998513333103,\n \"acc_norm\": 0.44581005586592176,\n\
\ \"acc_norm_stderr\": 0.016623998513333103\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n\
\ \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.02378858355165854,\n\
\ \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.02378858355165854\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4804432855280313,\n\
\ \"acc_stderr\": 0.012760464028289299,\n \"acc_norm\": 0.4804432855280313,\n\
\ \"acc_norm_stderr\": 0.012760464028289299\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.684640522875817,\n \"acc_stderr\": 0.018798086284886887,\n \
\ \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.018798086284886887\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399677,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5410036719706243,\n\
\ \"mc1_stderr\": 0.017444544447661203,\n \"mc2\": 0.7084705277313671,\n\
\ \"mc2_stderr\": 0.014694482049743158\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.819258089976322,\n \"acc_stderr\": 0.010814911009613981\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7005307050796058,\n \
\ \"acc_stderr\": 0.012616300735519665\n }\n}\n```"
repo_url: https://huggingface.co/seyf1elislam/KuTrix-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|arc:challenge|25_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|gsm8k|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hellaswag|10_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-16T06-30-17.941313.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-16T06-30-17.941313.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- '**/details_harness|winogrande|5_2024-03-16T06-30-17.941313.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-16T06-30-17.941313.parquet'
- config_name: results
data_files:
- split: 2024_03_16T06_30_17.941313
path:
- results_2024-03-16T06-30-17.941313.parquet
- split: latest
path:
- results_2024-03-16T06-30-17.941313.parquet
---
# Dataset Card for Evaluation run of seyf1elislam/KuTrix-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [seyf1elislam/KuTrix-7b](https://huggingface.co/seyf1elislam/KuTrix-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_seyf1elislam__KuTrix-7b",
	"harness_winogrande_5",
	split="latest")
```
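Each timestamped split name encodes the run time. If a repo accumulates several runs, the most recent one can be selected by parsing the timestamps; the sketch below (the helper name `newest_split` is illustrative, not part of the leaderboard tooling) shows one way to do it:

```python
from datetime import datetime

def newest_split(split_names):
    """Return the most recent timestamped split name.

    Split names follow the pattern used in this card,
    e.g. '2024_03_16T06_30_17.941313'; the 'latest' alias is skipped.
    """
    stamped = [s for s in split_names if s != "latest"]
    # Parse 'YYYY_MM_DDTHH_MM_SS.microseconds' into a datetime for comparison.
    return max(stamped, key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"))

print(newest_split(["2024_03_16T06_30_17.941313", "latest"]))
```

For a dataset with a single run, as here, this simply returns the one timestamped split.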
## Latest results
These are the [latest results from run 2024-03-16T06:30:17.941313](https://huggingface.co/datasets/open-llm-leaderboard/details_seyf1elislam__KuTrix-7b/blob/main/results_2024-03-16T06-30-17.941313.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6576587418096124,
"acc_stderr": 0.03201598935752146,
"acc_norm": 0.6575488737220112,
"acc_norm_stderr": 0.03267746108371606,
"mc1": 0.5410036719706243,
"mc1_stderr": 0.017444544447661203,
"mc2": 0.7084705277313671,
"mc2_stderr": 0.014694482049743158
},
"harness|arc:challenge|25": {
"acc": 0.6834470989761092,
"acc_stderr": 0.013592431519068077,
"acc_norm": 0.7047781569965871,
"acc_norm_stderr": 0.013329750293382318
},
"harness|hellaswag|10": {
"acc": 0.7027484564827724,
"acc_stderr": 0.004561141293448457,
"acc_norm": 0.8794064927305317,
"acc_norm_stderr": 0.0032498873947065044
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569526,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569526
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977934,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8532110091743119,
"acc_stderr": 0.015173141845126243,
"acc_norm": 0.8532110091743119,
"acc_norm_stderr": 0.015173141845126243
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156861,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156861
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608303,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608303
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.44581005586592176,
"acc_stderr": 0.016623998513333103,
"acc_norm": 0.44581005586592176,
"acc_norm_stderr": 0.016623998513333103
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.02378858355165854,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.02378858355165854
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4804432855280313,
"acc_stderr": 0.012760464028289299,
"acc_norm": 0.4804432855280313,
"acc_norm_stderr": 0.012760464028289299
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.684640522875817,
"acc_stderr": 0.018798086284886887,
"acc_norm": 0.684640522875817,
"acc_norm_stderr": 0.018798086284886887
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399677,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5410036719706243,
"mc1_stderr": 0.017444544447661203,
"mc2": 0.7084705277313671,
"mc2_stderr": 0.014694482049743158
},
"harness|winogrande|5": {
"acc": 0.819258089976322,
"acc_stderr": 0.010814911009613981
},
"harness|gsm8k|5": {
"acc": 0.7005307050796058,
"acc_stderr": 0.012616300735519665
}
}
```
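The per-task scores in the JSON block above can be rolled up into a single MMLU figure by averaging the `acc` values of the `hendrycksTest` subtasks. A minimal sketch, using an illustrative subset of the values rather than the full results file:

```python
# Aggregate MMLU (hendrycksTest) accuracy from a results dict shaped like
# the JSON block above. The entries here are a small illustrative subset,
# not the complete results file.
results = {
    "harness|hendrycksTest-computer_security|5": {"acc": 0.78},
    "harness|hendrycksTest-college_physics|5": {"acc": 0.4215686274509804},
    "harness|gsm8k|5": {"acc": 0.7005307050796058},  # not an MMLU subtask
}

# Keep only the MMLU subtasks, identified by their "hendrycksTest-" prefix.
mmlu_accs = [
    v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")
]
mean_mmlu_acc = sum(mmlu_accs) / len(mmlu_accs)
print(round(mean_mmlu_acc, 4))  # → 0.6008
```

The leaderboard computes its aggregate over all 57 MMLU subtasks the same way; the subset above is only for demonstration.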
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
DeepFoldProtein/openfold_msa_contrastive_cards_000 | ---
dataset_info:
features:
- name: query_accession
dtype: string
- name: excludes
sequence: string
- name: query_sequence
dtype: string
- name: target_accessions
sequence: string
- name: target_sequences
sequence: string
splits:
- name: train
num_bytes: 1368566426
num_examples: 259752
download_size: 1342381431
dataset_size: 1368566426
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "openfold_msa_contrastive_cards_000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Lancelot53/srbd1_v2_annotated | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: xml
dtype: string
- name: html
dtype: string
- name: response
dtype: string
- name: annotated
dtype: string
splits:
- name: train
num_bytes: 29595348.121978022
num_examples: 1077
download_size: 3598400
dataset_size: 29595348.121978022
---
# Dataset Card for "srbd1_v2_annotated"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gokuls/wiki_book_corpus_processed_bert_dataset_small | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
splits:
- name: train
num_bytes: 5550400800.0
num_examples: 1541778
download_size: 1636779213
dataset_size: 5550400800.0
---
# Dataset Card for "wiki_book_corpus_processed_bert_dataset_small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/DTD_parition1_test_google_flan_t5_xxl_mode_T_A_ns_1880 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_ViT_L_14_Attributes_ViT_L_14_text_davinci_003_dtd_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 852862
num_examples: 1880
- name: fewshot_0_clip_tags_ViT_L_14_Attributes_ViT_L_14_text_davinci_003_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 1030510
num_examples: 1880
download_size: 477779
dataset_size: 1883372
---
# Dataset Card for "DTD_parition1_test_google_flan_t5_xxl_mode_T_A_ns_1880"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-samsum-samsum-1bb2ba-1486554327 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- samsum
eval_info:
task: summarization
model: SamuelAllen123/t5-efficient-large-nl36_fine_tune_sum
metrics: []
dataset_name: samsum
dataset_config: samsum
dataset_split: validation
col_mapping:
text: dialogue
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: SamuelAllen123/t5-efficient-large-nl36_fine_tune_sum
* Dataset: samsum
* Config: samsum
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@samuelallen123](https://huggingface.co/samuelallen123) for evaluating this model. |
Dahoas/unet-cifar10-32 | ---
dataset_info:
features:
- name: images
sequence:
sequence:
sequence: uint8
splits:
- name: train
num_bytes: 7110656
num_examples: 2048
download_size: 6350172
dataset_size: 7110656
---
# Dataset Card for "unet-cifar10-32"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qkvo_rank14_v3 | ---
pretty_name: Evaluation run of xxyyy123/10k_v1_lora_qkvo_rank14_v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xxyyy123/10k_v1_lora_qkvo_rank14_v3](https://huggingface.co/xxyyy123/10k_v1_lora_qkvo_rank14_v3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qkvo_rank14_v3\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-03T13:17:02.987872](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qkvo_rank14_v3/blob/main/results_2023-09-03T13%3A17%3A02.987872.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5091352266849982,\n\
\ \"acc_stderr\": 0.03495474191892426,\n \"acc_norm\": 0.5128128131582483,\n\
\ \"acc_norm_stderr\": 0.03493935725866389,\n \"mc1\": 0.3671970624235006,\n\
\ \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5344202990692574,\n\
\ \"mc2_stderr\": 0.015729161957393895\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5298634812286689,\n \"acc_stderr\": 0.014585305840007105,\n\
\ \"acc_norm\": 0.5597269624573379,\n \"acc_norm_stderr\": 0.01450676952480424\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6050587532364071,\n\
\ \"acc_stderr\": 0.004878390226591715,\n \"acc_norm\": 0.7921728739294961,\n\
\ \"acc_norm_stderr\": 0.00404923158643323\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04068942293855797,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04068942293855797\n },\n\
\ \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n \
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5584905660377358,\n \"acc_stderr\": 0.030561590426731833,\n\
\ \"acc_norm\": 0.5584905660377358,\n \"acc_norm_stderr\": 0.030561590426731833\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n\
\ \"acc_stderr\": 0.041614023984032786,\n \"acc_norm\": 0.5486111111111112,\n\
\ \"acc_norm_stderr\": 0.041614023984032786\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.45664739884393063,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.45664739884393063,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.29365079365079366,\n \"acc_stderr\": 0.023456037383982022,\n \"\
acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.023456037383982022\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235173,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5451612903225806,\n\
\ \"acc_stderr\": 0.028327743091561077,\n \"acc_norm\": 0.5451612903225806,\n\
\ \"acc_norm_stderr\": 0.028327743091561077\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3793103448275862,\n \"acc_stderr\": 0.034139638059062345,\n\
\ \"acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.034139638059062345\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6363636363636364,\n \"acc_stderr\": 0.03427308652999934,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03427308652999934\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7253886010362695,\n \"acc_stderr\": 0.03221024508041153,\n\
\ \"acc_norm\": 0.7253886010362695,\n \"acc_norm_stderr\": 0.03221024508041153\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4948717948717949,\n \"acc_stderr\": 0.02534967290683866,\n \
\ \"acc_norm\": 0.4948717948717949,\n \"acc_norm_stderr\": 0.02534967290683866\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275805,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275805\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5126050420168067,\n \"acc_stderr\": 0.03246816765752174,\n \
\ \"acc_norm\": 0.5126050420168067,\n \"acc_norm_stderr\": 0.03246816765752174\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7211009174311926,\n \"acc_stderr\": 0.0192274688764635,\n \"acc_norm\"\
: 0.7211009174311926,\n \"acc_norm_stderr\": 0.0192274688764635\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4166666666666667,\n\
\ \"acc_stderr\": 0.03362277436608043,\n \"acc_norm\": 0.4166666666666667,\n\
\ \"acc_norm_stderr\": 0.03362277436608043\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.03228210387037892,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.03228210387037892\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5515695067264574,\n\
\ \"acc_stderr\": 0.03337883736255098,\n \"acc_norm\": 0.5515695067264574,\n\
\ \"acc_norm_stderr\": 0.03337883736255098\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009224,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009224\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"\
acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5398773006134969,\n \"acc_stderr\": 0.03915857291436971,\n\
\ \"acc_norm\": 0.5398773006134969,\n \"acc_norm_stderr\": 0.03915857291436971\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7649572649572649,\n\
\ \"acc_stderr\": 0.027778835904935434,\n \"acc_norm\": 0.7649572649572649,\n\
\ \"acc_norm_stderr\": 0.027778835904935434\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.016328814422102052,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.016328814422102052\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.026680134761679214,\n\
\ \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.026680134761679214\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2547486033519553,\n\
\ \"acc_stderr\": 0.014572650383409155,\n \"acc_norm\": 0.2547486033519553,\n\
\ \"acc_norm_stderr\": 0.014572650383409155\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5359477124183006,\n \"acc_stderr\": 0.02855582751652878,\n\
\ \"acc_norm\": 0.5359477124183006,\n \"acc_norm_stderr\": 0.02855582751652878\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5819935691318328,\n\
\ \"acc_stderr\": 0.028013651891995072,\n \"acc_norm\": 0.5819935691318328,\n\
\ \"acc_norm_stderr\": 0.028013651891995072\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5524691358024691,\n \"acc_stderr\": 0.027667138569422704,\n\
\ \"acc_norm\": 0.5524691358024691,\n \"acc_norm_stderr\": 0.027667138569422704\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36879432624113473,\n \"acc_stderr\": 0.028782227561347243,\n \
\ \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.028782227561347243\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3820078226857888,\n\
\ \"acc_stderr\": 0.012409564470235567,\n \"acc_norm\": 0.3820078226857888,\n\
\ \"acc_norm_stderr\": 0.012409564470235567\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.48161764705882354,\n \"acc_stderr\": 0.030352303395351964,\n\
\ \"acc_norm\": 0.48161764705882354,\n \"acc_norm_stderr\": 0.030352303395351964\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4820261437908497,\n \"acc_stderr\": 0.020214761037872404,\n \
\ \"acc_norm\": 0.4820261437908497,\n \"acc_norm_stderr\": 0.020214761037872404\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6040816326530613,\n \"acc_stderr\": 0.03130802899065686,\n\
\ \"acc_norm\": 0.6040816326530613,\n \"acc_norm_stderr\": 0.03130802899065686\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5373134328358209,\n\
\ \"acc_stderr\": 0.035256751674679745,\n \"acc_norm\": 0.5373134328358209,\n\
\ \"acc_norm_stderr\": 0.035256751674679745\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3671970624235006,\n\
\ \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5344202990692574,\n\
\ \"mc2_stderr\": 0.015729161957393895\n }\n}\n```"
repo_url: https://huggingface.co/xxyyy123/10k_v1_lora_qkvo_rank14_v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|arc:challenge|25_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hellaswag|10_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T13:17:02.987872.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T13:17:02.987872.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T13:17:02.987872.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T13:17:02.987872.parquet'
- config_name: results
data_files:
- split: 2023_09_03T13_17_02.987872
path:
- results_2023-09-03T13:17:02.987872.parquet
- split: latest
path:
- results_2023-09-03T13:17:02.987872.parquet
---
# Dataset Card for Evaluation run of xxyyy123/10k_v1_lora_qkvo_rank14_v3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/xxyyy123/10k_v1_lora_qkvo_rank14_v3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [xxyyy123/10k_v1_lora_qkvo_rank14_v3](https://huggingface.co/xxyyy123/10k_v1_lora_qkvo_rank14_v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qkvo_rank14_v3",
"harness_truthfulqa_mc_0",
	split="latest")
```
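As a minimal illustration of what the loaded data contains, the sketch below works on a plain dictionary shaped like the results JSON shown in the next section (the sample values are copied from it; the helper name `per_task_acc_norm` is hypothetical, not part of the leaderboard tooling):

```python
# Sample of the aggregated results structure, copied from the
# "Latest results" JSON below (trimmed to a few tasks).
results = {
    "all": {"acc": 0.5091352266849982, "acc_norm": 0.5128128131582483},
    "harness|arc:challenge|25": {"acc": 0.5298634812286689, "acc_norm": 0.5597269624573379},
    "harness|hellaswag|10": {"acc": 0.6050587532364071, "acc_norm": 0.7921728739294961},
}

def per_task_acc_norm(results: dict) -> dict:
    """Return {task_name: acc_norm}, skipping the 'all' aggregate."""
    return {task: scores["acc_norm"]
            for task, scores in results.items()
            if task != "all" and "acc_norm" in scores}

scores = per_task_acc_norm(results)
best_task = max(scores, key=scores.get)
print(best_task)  # harness|hellaswag|10
```

The same pattern applies to any of the per-task keys (`acc`, `acc_stderr`, `mc1`, `mc2`, ...) once the "results" configuration has been loaded.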
## Latest results
These are the [latest results from run 2023-09-03T13:17:02.987872](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__10k_v1_lora_qkvo_rank14_v3/blob/main/results_2023-09-03T13%3A17%3A02.987872.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each task in its own configuration, under the "latest" split):
```json
{
"all": {
"acc": 0.5091352266849982,
"acc_stderr": 0.03495474191892426,
"acc_norm": 0.5128128131582483,
"acc_norm_stderr": 0.03493935725866389,
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5344202990692574,
"mc2_stderr": 0.015729161957393895
},
"harness|arc:challenge|25": {
"acc": 0.5298634812286689,
"acc_stderr": 0.014585305840007105,
"acc_norm": 0.5597269624573379,
"acc_norm_stderr": 0.01450676952480424
},
"harness|hellaswag|10": {
"acc": 0.6050587532364071,
"acc_stderr": 0.004878390226591715,
"acc_norm": 0.7921728739294961,
"acc_norm_stderr": 0.00404923158643323
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5,
"acc_stderr": 0.04068942293855797,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04068942293855797
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5584905660377358,
"acc_stderr": 0.030561590426731833,
"acc_norm": 0.5584905660377358,
"acc_norm_stderr": 0.030561590426731833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5486111111111112,
"acc_stderr": 0.041614023984032786,
"acc_norm": 0.5486111111111112,
"acc_norm_stderr": 0.041614023984032786
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.45664739884393063,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.45664739884393063,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.023456037383982022,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.023456037383982022
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235173,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5451612903225806,
"acc_stderr": 0.028327743091561077,
"acc_norm": 0.5451612903225806,
"acc_norm_stderr": 0.028327743091561077
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3793103448275862,
"acc_stderr": 0.034139638059062345,
"acc_norm": 0.3793103448275862,
"acc_norm_stderr": 0.034139638059062345
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03427308652999934,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03427308652999934
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7253886010362695,
"acc_stderr": 0.03221024508041153,
"acc_norm": 0.7253886010362695,
"acc_norm_stderr": 0.03221024508041153
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4948717948717949,
"acc_stderr": 0.02534967290683866,
"acc_norm": 0.4948717948717949,
"acc_norm_stderr": 0.02534967290683866
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275805,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275805
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5126050420168067,
"acc_stderr": 0.03246816765752174,
"acc_norm": 0.5126050420168067,
"acc_norm_stderr": 0.03246816765752174
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7211009174311926,
"acc_stderr": 0.0192274688764635,
"acc_norm": 0.7211009174311926,
"acc_norm_stderr": 0.0192274688764635
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608043,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.03228210387037892,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.03228210387037892
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5515695067264574,
"acc_stderr": 0.03337883736255098,
"acc_norm": 0.5515695067264574,
"acc_norm_stderr": 0.03337883736255098
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009224,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009224
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5398773006134969,
"acc_stderr": 0.03915857291436971,
"acc_norm": 0.5398773006134969,
"acc_norm_stderr": 0.03915857291436971
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7649572649572649,
"acc_stderr": 0.027778835904935434,
"acc_norm": 0.7649572649572649,
"acc_norm_stderr": 0.027778835904935434
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.016328814422102052,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.016328814422102052
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.026680134761679214,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.026680134761679214
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2547486033519553,
"acc_stderr": 0.014572650383409155,
"acc_norm": 0.2547486033519553,
"acc_norm_stderr": 0.014572650383409155
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5359477124183006,
"acc_stderr": 0.02855582751652878,
"acc_norm": 0.5359477124183006,
"acc_norm_stderr": 0.02855582751652878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5819935691318328,
"acc_stderr": 0.028013651891995072,
"acc_norm": 0.5819935691318328,
"acc_norm_stderr": 0.028013651891995072
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5524691358024691,
"acc_stderr": 0.027667138569422704,
"acc_norm": 0.5524691358024691,
"acc_norm_stderr": 0.027667138569422704
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36879432624113473,
"acc_stderr": 0.028782227561347243,
"acc_norm": 0.36879432624113473,
"acc_norm_stderr": 0.028782227561347243
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3820078226857888,
"acc_stderr": 0.012409564470235567,
"acc_norm": 0.3820078226857888,
"acc_norm_stderr": 0.012409564470235567
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.48161764705882354,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.48161764705882354,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4820261437908497,
"acc_stderr": 0.020214761037872404,
"acc_norm": 0.4820261437908497,
"acc_norm_stderr": 0.020214761037872404
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6040816326530613,
"acc_stderr": 0.03130802899065686,
"acc_norm": 0.6040816326530613,
"acc_norm_stderr": 0.03130802899065686
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5373134328358209,
"acc_stderr": 0.035256751674679745,
"acc_norm": 0.5373134328358209,
"acc_norm_stderr": 0.035256751674679745
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6783625730994152,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.6783625730994152,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5344202990692574,
"mc2_stderr": 0.015729161957393895
}
}
```
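The per-task entries above can be aggregated programmatically. Below is a minimal sketch (not part of the official leaderboard tooling, which weights by example count rather than taking a plain task mean) showing how one might average a metric over the `harness|hendrycksTest-*` tasks from a results dict shaped like the JSON above; the sample values are a small illustrative subset, not the full results.

```python
# Illustrative subset of the results dict shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.31, "acc_norm": 0.31},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.48148148148148145, "acc_norm": 0.48148148148148145},
    "harness|arc:challenge|25": {"acc": 0.5298634812286689, "acc_norm": 0.5597269624573379},
}

def mean_metric(results: dict, prefix: str, metric: str = "acc") -> float:
    """Unweighted mean of `metric` over all tasks whose key starts with `prefix`."""
    scores = [v[metric] for k, v in results.items() if k.startswith(prefix)]
    return sum(scores) / len(scores)

mmlu_acc = mean_metric(results, "harness|hendrycksTest-")
print(f"hendrycksTest mean acc over subset: {mmlu_acc:.4f}")
```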
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
passionMan/usda_tokenized_target | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 541970
num_examples: 2527
- name: test
num_bytes: 180736
num_examples: 843
download_size: 136249
dataset_size: 722706
---
# Dataset Card for "usda_tokenized_target"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RUCAIBox/Erya-dataset | ---
license: apache-2.0
task_categories:
- translation
- text-generation
---
**monolingual.tgz** contains a vast collection of Ancient Chinese sentences.
**trans.tgz** combines various types of parallel corpora, with Ancient Chinese sentences aligned with Modern Chinese sentences.
**finetune.tgz** selects certain classical books as benchmarks, with Ancient Chinese sentences aligned with Modern Chinese sentences.
The data in **trans.tgz** and **finetune.tgz** does not overlap. More information can be found here [RUCAIBox/Erya (github.com)](https://github.com/RUCAIBox/Erya). |
semiotic/spider_dataset_tuning | ---
dataset_info:
features:
- name: type
dtype: string
- name: question
dtype: string
- name: query
dtype: string
- name: db_id
dtype: string
- name: schema
dtype: string
splits:
- name: train
num_bytes: 125169641
num_examples: 97317
- name: val
num_bytes: 10757137
num_examples: 7909
- name: test
num_bytes: 1384246
num_examples: 1292
download_size: 7245840
dataset_size: 137311024
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
liuyanchen1015/MULTI_VALUE_wnli_you_ye | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 1605
num_examples: 7
- name: train
num_bytes: 849
num_examples: 6
download_size: 6423
dataset_size: 2454
---
# Dataset Card for "MULTI_VALUE_wnli_you_ye"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kanishka/counterfactual_training_test | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 79390
num_examples: 1000
- name: validation
num_bytes: 56120230
num_examples: 1026747
download_size: 0
dataset_size: 56199620
---
# Dataset Card for "counterfactual_training_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dahwinsingularity/dahyun-v-l-0 | ---
license: apache-2.0
---
|
intfloat/query2doc_msmarco | ---
license: cc-by-4.0
language:
- en
size_categories:
- 100K<n<1M
---
### Dataset Summary
This dataset contains GPT-3.5 (`text-davinci-003`) generations from MS-MARCO queries.
[Query2doc: Query Expansion with Large Language Models](https://arxiv.org/pdf/2303.07678.pdf) Liang Wang, Nan Yang and Furu Wei
### Data Instances
An example looks as follows.
```
{
"query_id": "1030303",
"query": "who is aziz hashim",
"pseudo_doc": "Aziz Hashim is a renowned entrepreneur, business leader, and one of the most successful restaurant franchise operators in the US. He is the founder of NRD Capital, a private equity firm focused on investments in multi-unit restaurant franchised businesses. Hashim has built a formidable track record of success in the franchise industry, with brands such as Outback Steakhouse and Jamba Juice. His accomplishments and philanthropic initiatives have earned him numerous awards, including the prestigious Ernst and Young Entrepreneur of the Year award."
}
```
### Data Fields
- `query_id`: a `string` feature.
- `query`: a `string` feature.
- `pseudo_doc`: a `string` feature.
### Data Splits
| train | dev | test | trec_dl2019 | trec_dl2020 |
|--------|------:|------:|------:|------:|
| 502939 | 6980 | 6837 | 43 | 54 |
### How to use this dataset
```python
from datasets import load_dataset
dataset = load_dataset('intfloat/query2doc_msmarco')
print(dataset['trec_dl2019'][0])
```
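The paper expands each query for sparse retrieval by repeating it (to up-weight the original terms) and appending the generated pseudo-document. The sketch below illustrates that idea on a record with the fields shown above; the repetition count of 5 and the exact concatenation format are assumptions for illustration, so check `repro_bm25.py` for the formatting actually used.

```python
def query2doc_bm25(example: dict, n_repeat: int = 5) -> str:
    """Sketch of query2doc expansion for BM25: repeat the original query
    n_repeat times, then append the model-generated pseudo-document."""
    return " ".join([example["query"]] * n_repeat + [example["pseudo_doc"]])

# Record mirroring the instance shown above (pseudo_doc truncated).
example = {
    "query_id": "1030303",
    "query": "who is aziz hashim",
    "pseudo_doc": "Aziz Hashim is a renowned entrepreneur ...",
}
expanded = query2doc_bm25(example)
print(expanded)
```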
### Reproducing our results
We provide a python script [repro_bm25.py](https://huggingface.co/datasets/intfloat/query2doc_msmarco/blob/main/repro_bm25.py) to reproduce our results with BM25 retrieval.
First install some python dependency packages:
```
pip install pyserini==0.15.0 pytrec_eval datasets tqdm
```
Then download and run the python code:
```
python repro_bm25.py
```
This script utilizes the pre-built Lucene index from [Pyserini](https://github.com/castorini/pyserini/blob/pyserini-0.15.0/docs/prebuilt-indexes.md)
and might yield slightly different results compared to the paper.
### Citation Information
```
@article{wang2023query2doc,
title={Query2doc: Query Expansion with Large Language Models},
author={Wang, Liang and Yang, Nan and Wei, Furu},
journal={arXiv preprint arXiv:2303.07678},
year={2023}
}
```
|
mespinosami/map2sat-central-belt-clarity-old-map20-samples | ---
dataset_info:
features:
- name: input_image
dtype: image
- name: edit_prompt
dtype: string
- name: edited_image
dtype: image
splits:
- name: train
num_bytes: 857306.8
num_examples: 16
- name: test
num_bytes: 201058.2
num_examples: 4
download_size: 1061836
dataset_size: 1058365.0
---
# Dataset Card for "map2sat-central-belt-clarity-old-map20-samples"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ricahrd/McKevin | ---
license: openrail
---
|
harpreetsahota/Instruction-Following-Evaluation-for-Large-Language-Models | ---
dataset_info:
features:
- name: key
dtype: int64
- name: prompt
dtype: string
- name: instruction_id_list
sequence: string
- name: kwargs
dtype: string
splits:
- name: train
num_bytes: 181824
num_examples: 541
download_size: 80840
dataset_size: 181824
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Instruction-Following Evaluation Dataset
## 📜 Overview
This dataset, specifically designed for the **evaluation of large language models in instruction-following tasks**, is directly inspired by the methodologies and experiments described in the paper titled _"Instruction-Following Evaluation for Large Language Models"_. The dataset's creation and availability on HuggingFace are aimed at enhancing research and application in the field of natural language understanding, particularly in the context of instruction interpretation and execution by AI models.
## 🌐 Source
The dataset draws its structure and content from the insights provided in:
- **Original Research Paper**: [_"Instruction-Following Evaluation for Large Language Models"_](https://arxiv.org/abs/2311.07911)
- **Original Data Repository**: [Google Research on GitHub](https://github.com/google-research/google-research/tree/master/instruction_following_eval)
## 📊 Dataset Structure
Consisting primarily of **'prompts'**, this dataset is tailored to challenge and assess language models on various facets of understanding and executing instructions. Each prompt represents a unique scenario or task, simulating real-world applications where accurate interpretation of instructions is crucial.
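The schema in the YAML header above (`key`: int, `prompt`: string, `instruction_id_list`: sequence of strings, `kwargs`: string) suggests one record per prompt, with `kwargs` holding serialized per-instruction parameters. The sketch below uses an invented record for illustration only; the field values, the instruction ID, and the assumption that `kwargs` is JSON-encoded are all hypothetical, so inspect actual records before relying on this parsing.

```python
import json

# Hypothetical record mirroring the schema above; values are invented.
record = {
    "key": 0,
    "prompt": "Write a summary in exactly three bullet points.",
    "instruction_id_list": ["detectable_format:number_bullet_lists"],
    "kwargs": '[{"num_bullets": 3}]',  # assumed JSON encoding
}

# Pair each instruction ID with its parameter dict.
constraints = json.loads(record["kwargs"])
for inst_id, kw in zip(record["instruction_id_list"], constraints):
    print(inst_id, kw)
```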
## 💡 Usage
Targeted for use within the **HuggingFace ecosystem**, this dataset serves as a pivotal tool for researchers and developers focusing on the advancement of language models. It stands as a benchmark for:
- 📈 Evaluating model performance in instruction-following tasks.
- 🔍 Identifying model capabilities and areas of improvement.
- 🤖 Enhancing AI's understanding of complex, human-like commands.
## 🙏 Acknowledgements
This dataset is a tribute to the foundational work presented in the original paper and is intended for academic and research purposes. It reflects a commitment to furthering the understanding of AI's interaction with human language, particularly in processing and responding to diverse and complex instructions.
|
open-llm-leaderboard/details_TFLai__gpt2-turkish-uncased | ---
pretty_name: Evaluation run of TFLai/gpt2-turkish-uncased
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TFLai/gpt2-turkish-uncased](https://huggingface.co/TFLai/gpt2-turkish-uncased)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__gpt2-turkish-uncased\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-02T15:29:40.186292](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__gpt2-turkish-uncased/blob/main/results_2023-12-02T15-29-40.186292.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.0,\n \"\
acc_stderr\": 0.0\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \
\ \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/TFLai/gpt2-turkish-uncased
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|arc:challenge|25_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T01_34_05.823968
path:
- '**/details_harness|drop|3_2023-10-22T01-34-05.823968.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T01-34-05.823968.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T01_34_05.823968
path:
- '**/details_harness|gsm8k|5_2023-10-22T01-34-05.823968.parquet'
- split: 2023_12_02T15_29_40.186292
path:
- '**/details_harness|gsm8k|5_2023-12-02T15-29-40.186292.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-02T15-29-40.186292.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hellaswag|10_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T09:48:46.264649.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T09:48:46.264649.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T09:48:46.264649.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T01_34_05.823968
path:
- '**/details_harness|winogrande|5_2023-10-22T01-34-05.823968.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T01-34-05.823968.parquet'
- config_name: results
data_files:
- split: 2023_07_24T09_48_46.264649
path:
- results_2023-07-24T09:48:46.264649.parquet
- split: 2023_10_22T01_34_05.823968
path:
- results_2023-10-22T01-34-05.823968.parquet
- split: 2023_12_02T15_29_40.186292
path:
- results_2023-12-02T15-29-40.186292.parquet
- split: latest
path:
- results_2023-12-02T15-29-40.186292.parquet
---
# Dataset Card for Evaluation run of TFLai/gpt2-turkish-uncased
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/gpt2-turkish-uncased
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/gpt2-turkish-uncased](https://huggingface.co/TFLai/gpt2-turkish-uncased) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__gpt2-turkish-uncased",
"harness_gsm8k_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-02T15:29:40.186292](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__gpt2-turkish-uncased/blob/main/results_2023-12-02T15-29-40.186292.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
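The `"all"` entry aggregates the per-task metrics. A minimal sketch of how such an aggregate could be recomputed from the per-task entries (based on the result JSON shown above; the actual harness may weight or combine tasks differently):

```python
# Recompute an "all" aggregate from the per-task entries of a results dict.
# The dict below mirrors the latest-run JSON shown above.
results = {
    "all": {"acc": 0.0, "acc_stderr": 0.0},
    "harness|gsm8k|5": {"acc": 0.0, "acc_stderr": 0.0},
}

# Average accuracy over every task entry, skipping the precomputed aggregate.
task_accs = [v["acc"] for k, v in results.items() if k != "all"]
mean_acc = sum(task_accs) / len(task_accs)
print(mean_acc)  # 0.0, matching the "all" entry for this run
```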
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
AdapterOcean/python3-standardized_cluster_16_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 3711623
num_examples: 3834
download_size: 762202
dataset_size: 3711623
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python3-standardized_cluster_16_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibupari/mini-learnilatypus | ---
dataset_info:
features:
- name: title
dtype: string
- name: subtitle
dtype: string
- name: paragraph
dtype: string
- name: sentences
dtype: string
splits:
- name: train
num_bytes: 1324745
num_examples: 1000
download_size: 264663
dataset_size: 1324745
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
humane-lab/K-HATERS | ---
license: cc-by-4.0
language:
- ko
pretty_name: K-Haters
tags:
- hate speech detection
---
<!--
# ℹ️ Dataset card for K-HATERS
### Dataset summary
We introduce **K-HATERS**, a new corpus for hate speech detection in Korean, comprising approximately 192K news comments with target-specific offensiveness ratings.
The corpus comprises 192,158 news comments: 184,117 collected by ourselves and 8,041 collected from a [previous study](https://aclanthology.org/2020.socialnlp-1.4/).
We collected news comments published through the politics, society, and world news sections in Naver News over two months in 2021.
All comments were annotated through CashMission, a crowdsourcing service run by SELECTSTAR.
</br>For more information, please refer to the paper [K-HATERS](https://arxiv.org/abs/2310.15439) published at EMNLP 2023 Findings.
### Supported tasks
- Hate speech detection
- Multi-class classification (labels: normal, offensive, L1_hate, L2_hate)
- Binary classification (labels: normal, toxic (offensive, L1_hate, L2_hate))
- Rationale prediction (offensiveness, target rationale)
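The binary task collapses the 4-way labels described above; a minimal sketch of that mapping (the label strings follow the card, the helper name is illustrative):

```python
# Map the 4-way K-HATERS labels to the binary scheme described above:
# "normal" stays "normal", all other labels collapse to "toxic".
TOXIC_LABELS = {"offensive", "L1_hate", "L2_hate"}

def to_binary(label: str) -> str:
    if label != "normal" and label not in TOXIC_LABELS:
        raise ValueError(f"unknown label: {label}")
    return "toxic" if label in TOXIC_LABELS else "normal"

print(to_binary("L1_hate"))  # toxic
print(to_binary("normal"))   # normal
```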
### Data description
```
data['train'][42]
{'text': '군대도 안간 놈 이 주둥아리 는 씽씽하네..보수 놈 들..군대는 안가고 애국이냐..#@이름#,#@이름#,',
'label': 'L1_hate',
'target_label': ['political'],
'offensiveness_rationale': [[7, 8], [11, 15], [27, 28]],
'target_rationale': [[24, 26], [46, 51], [52, 57]]}
```
- Abusive language categories (**label**)
- L2_hate: Comments with explicit forms of hate expressions toward one of the groups of protected attributes (e.g., gender, age, race, ...)
- L1_hate: Comments with more implicit forms of hate expressions
- Offensive: Comments that express offensiveness but not toward a protected attribute group
- Normal: The rest comments
- Multi-label target categories (**target_label**): list of offensiveness targets. A comment can have zero or multiple targets.
- List of target categories: gender, age, race, religion, politics, job, disability, individuals, and others.
- Annotators' rationales for the strength of ratings (**offensiveness_rationale**): lists providing annotators' rationales for the strength of ratings. The list includes the start and end indices of highlight spans.
- Annotators' rationales for the target of offensiveness (**target_rationale**)
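The rationale fields are lists of character-index spans into `text`; a small sketch (using the example instance shown above) of recovering the highlighted substrings:

```python
# Recover the highlighted rationale spans from [start, end) character
# indices, using the example instance shown in the card above.
example = {
    "text": "군대도 안간 놈 이 주둥아리 는 씽씽하네..보수 놈 들..군대는 안가고 애국이냐..#@이름#,#@이름#,",
    "offensiveness_rationale": [[7, 8], [11, 15], [27, 28]],
    "target_rationale": [[24, 26], [46, 51], [52, 57]],
}

def spans(text: str, indices: list) -> list:
    """Slice out each [start, end) span from the comment text."""
    return [text[s:e] for s, e in indices]

print(spans(example["text"], example["offensiveness_rationale"]))  # ['놈', '주둥아리', '놈']
print(spans(example["text"], example["target_rationale"]))
```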
### Dataset split
We provide the dataset as three splits: 172,158 comments for train, 10,000 for validation, and 10,000 for test. The label ratio was preserved (stratified split).
### Labeling guidelines
Labeling guidelines are available as a part of SELECTSTAR open datasets (in Korean). [link](https://open.selectstar.ai/ko/?page_id=5948)
</br>
# 📜 Data statement
We present the data statement for responsible usage [(Bender and Friedman, 2018)](https://aclanthology.org/Q18-1041/).
### Curation Rationale
We collected the raw data from the news aggregator of Naver, the largest news portal in Korea. We targeted news articles published in the society, world news, and politics sections because discussions are active in the hard news.
### Language Variety
Our dataset consists of news comments in Korean (ko-KR).
### Speaker Demographic
The user demographics are not available. However, considering that the portal site has the largest share among Korean users, it can be assumed that the speakers are mostly Korean.
### Annotator Demographic
A total of 405 workers participated in the annotation: 21 workers are in their 10s, 222 in their 20s, 116 in their 30s, 35 in their 40s, 9 in their 50s, and 2 in their 60s.
### Speech Situation
News articles in the hard news sections deal with controversial events, so hateful or toxic comments are more likely to appear. The target articles were published between July 2021 and August 2021. During that period, the most controversial events included the South Korean presidential election, the Tokyo Olympics, COVID-19, and the restoration of Taliban control.
### Text Characteristics
The corpus includes hate expressions specific to Korea, such as hatred toward certain political orientations and certain groups. For example, '대깨문' (a derogatory word for supporters of former Korean president Moon) and '꼴페미' (a derogatory word for feminists).
</br>
# 🤝 License & Contributors
### Licensing information
This dataset is shared under CC-BY 4.0.
</br>According to this license, you are free to use the dataset as long as you provide appropriate attribution (e.g., citing our paper).
### Citation information
```
@article{park2023haters,
title={K-HATERS: A Hate Speech Detection Corpus in Korean with Target-Specific Ratings},
author={Park, Chaewon and Kim, Suhwan and Park, Kyubyong and Park, Kunwoo},
journal={Findings of the EMNLP 2023},
year={2023}
}
```
### Contributions
- Chaewon Park
- Suhwan Kim (TUNiB)
- Kyubyong Park (TUNiB)
- Kunwoo Park
#--> |
open-llm-leaderboard/details_Sharathhebbar24__ssh_1.8B | ---
pretty_name: Evaluation run of Sharathhebbar24/ssh_1.8B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sharathhebbar24/ssh_1.8B](https://huggingface.co/Sharathhebbar24/ssh_1.8B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sharathhebbar24__ssh_1.8B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-03T16:03:37.862164](https://huggingface.co/datasets/open-llm-leaderboard/details_Sharathhebbar24__ssh_1.8B/blob/main/results_2024-02-03T16-03-37.862164.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4400975999737303,\n\
\ \"acc_stderr\": 0.0345967614345703,\n \"acc_norm\": 0.4431186866947614,\n\
\ \"acc_norm_stderr\": 0.03532660922667111,\n \"mc1\": 0.2668298653610771,\n\
\ \"mc1_stderr\": 0.015483691939237265,\n \"mc2\": 0.4314996062576424,\n\
\ \"mc2_stderr\": 0.015306262833109105\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3728668941979522,\n \"acc_stderr\": 0.014131176760131165,\n\
\ \"acc_norm\": 0.39078498293515357,\n \"acc_norm_stderr\": 0.014258563880513778\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.47560246962756425,\n\
\ \"acc_stderr\": 0.004983837641502896,\n \"acc_norm\": 0.6236805417247561,\n\
\ \"acc_norm_stderr\": 0.00483471581420811\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37777777777777777,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.37777777777777777,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4605263157894737,\n \"acc_stderr\": 0.04056242252249033,\n\
\ \"acc_norm\": 0.4605263157894737,\n \"acc_norm_stderr\": 0.04056242252249033\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5056603773584906,\n \"acc_stderr\": 0.030770900763851316,\n\
\ \"acc_norm\": 0.5056603773584906,\n \"acc_norm_stderr\": 0.030770900763851316\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4166666666666667,\n\
\ \"acc_stderr\": 0.041227287076512825,\n \"acc_norm\": 0.4166666666666667,\n\
\ \"acc_norm_stderr\": 0.041227287076512825\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3872832369942196,\n\
\ \"acc_stderr\": 0.037143259063020656,\n \"acc_norm\": 0.3872832369942196,\n\
\ \"acc_norm_stderr\": 0.037143259063020656\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.040233822736177476,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.040233822736177476\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.032555253593403555,\n\
\ \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.032555253593403555\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523867,\n \"\
acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523867\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.028444006199428714,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.028444006199428714\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.33004926108374383,\n \"acc_stderr\": 0.033085304262282574,\n\
\ \"acc_norm\": 0.33004926108374383,\n \"acc_norm_stderr\": 0.033085304262282574\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5878787878787879,\n \"acc_stderr\": 0.03843566993588717,\n\
\ \"acc_norm\": 0.5878787878787879,\n \"acc_norm_stderr\": 0.03843566993588717\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.035623524993954825,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.035623524993954825\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\"\
: {\n \"acc\": 0.6113989637305699,\n \"acc_stderr\": 0.035177397963731316,\n\
\ \"acc_norm\": 0.6113989637305699,\n \"acc_norm_stderr\": 0.035177397963731316\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.40512820512820513,\n \"acc_stderr\": 0.024890471769938145,\n\
\ \"acc_norm\": 0.40512820512820513,\n \"acc_norm_stderr\": 0.024890471769938145\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066482,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066482\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.46638655462184875,\n \"acc_stderr\": 0.03240501447690071,\n\
\ \"acc_norm\": 0.46638655462184875,\n \"acc_norm_stderr\": 0.03240501447690071\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389024,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5467889908256881,\n \"acc_stderr\": 0.021343255165546037,\n \"\
acc_norm\": 0.5467889908256881,\n \"acc_norm_stderr\": 0.021343255165546037\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.35648148148148145,\n \"acc_stderr\": 0.032664783315272714,\n \"\
acc_norm\": 0.35648148148148145,\n \"acc_norm_stderr\": 0.032664783315272714\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.49019607843137253,\n \"acc_stderr\": 0.03508637358630572,\n \"\
acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.03508637358630572\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5316455696202531,\n \"acc_stderr\": 0.032481974005110756,\n \
\ \"acc_norm\": 0.5316455696202531,\n \"acc_norm_stderr\": 0.032481974005110756\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4977578475336323,\n\
\ \"acc_stderr\": 0.033557465352232634,\n \"acc_norm\": 0.4977578475336323,\n\
\ \"acc_norm_stderr\": 0.033557465352232634\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4732824427480916,\n \"acc_stderr\": 0.04379024936553893,\n\
\ \"acc_norm\": 0.4732824427480916,\n \"acc_norm_stderr\": 0.04379024936553893\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.628099173553719,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5648148148148148,\n\
\ \"acc_stderr\": 0.04792898170907062,\n \"acc_norm\": 0.5648148148148148,\n\
\ \"acc_norm_stderr\": 0.04792898170907062\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4723926380368098,\n \"acc_stderr\": 0.039223782906109894,\n\
\ \"acc_norm\": 0.4723926380368098,\n \"acc_norm_stderr\": 0.039223782906109894\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.044328040552915185,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.044328040552915185\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.046897659372781335,\n\
\ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.046897659372781335\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6581196581196581,\n\
\ \"acc_stderr\": 0.03107502852650775,\n \"acc_norm\": 0.6581196581196581,\n\
\ \"acc_norm_stderr\": 0.03107502852650775\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5644955300127714,\n\
\ \"acc_stderr\": 0.017730589927926588,\n \"acc_norm\": 0.5644955300127714,\n\
\ \"acc_norm_stderr\": 0.017730589927926588\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5115606936416185,\n \"acc_stderr\": 0.026911898686377927,\n\
\ \"acc_norm\": 0.5115606936416185,\n \"acc_norm_stderr\": 0.026911898686377927\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.014355911964767857,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.014355911964767857\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5359477124183006,\n \"acc_stderr\": 0.028555827516528787,\n\
\ \"acc_norm\": 0.5359477124183006,\n \"acc_norm_stderr\": 0.028555827516528787\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.45016077170418006,\n\
\ \"acc_stderr\": 0.028256660723360184,\n \"acc_norm\": 0.45016077170418006,\n\
\ \"acc_norm_stderr\": 0.028256660723360184\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.027744313443376536,\n\
\ \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.027744313443376536\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3404255319148936,\n \"acc_stderr\": 0.028267657482650144,\n \
\ \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.028267657482650144\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34485006518904826,\n\
\ \"acc_stderr\": 0.012139881006287058,\n \"acc_norm\": 0.34485006518904826,\n\
\ \"acc_norm_stderr\": 0.012139881006287058\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3786764705882353,\n \"acc_stderr\": 0.029465133639776132,\n\
\ \"acc_norm\": 0.3786764705882353,\n \"acc_norm_stderr\": 0.029465133639776132\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4084967320261438,\n \"acc_stderr\": 0.01988622103750188,\n \
\ \"acc_norm\": 0.4084967320261438,\n \"acc_norm_stderr\": 0.01988622103750188\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5183673469387755,\n \"acc_stderr\": 0.03198761546763127,\n\
\ \"acc_norm\": 0.5183673469387755,\n \"acc_norm_stderr\": 0.03198761546763127\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3781094527363184,\n\
\ \"acc_stderr\": 0.03428867848778658,\n \"acc_norm\": 0.3781094527363184,\n\
\ \"acc_norm_stderr\": 0.03428867848778658\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3795180722891566,\n\
\ \"acc_stderr\": 0.037777988227480165,\n \"acc_norm\": 0.3795180722891566,\n\
\ \"acc_norm_stderr\": 0.037777988227480165\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5146198830409356,\n \"acc_stderr\": 0.038331852752130254,\n\
\ \"acc_norm\": 0.5146198830409356,\n \"acc_norm_stderr\": 0.038331852752130254\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2668298653610771,\n\
\ \"mc1_stderr\": 0.015483691939237265,\n \"mc2\": 0.4314996062576424,\n\
\ \"mc2_stderr\": 0.015306262833109105\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5927387529597474,\n \"acc_stderr\": 0.013808654122417848\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.27520849128127367,\n \
\ \"acc_stderr\": 0.012302114305862647\n }\n}\n```"
repo_url: https://huggingface.co/Sharathhebbar24/ssh_1.8B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|arc:challenge|25_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|gsm8k|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hellaswag|10_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T16-03-37.862164.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-03T16-03-37.862164.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- '**/details_harness|winogrande|5_2024-02-03T16-03-37.862164.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-03T16-03-37.862164.parquet'
- config_name: results
data_files:
- split: 2024_02_03T16_03_37.862164
path:
- results_2024-02-03T16-03-37.862164.parquet
- split: latest
path:
- results_2024-02-03T16-03-37.862164.parquet
---
# Dataset Card for Evaluation run of Sharathhebbar24/ssh_1.8B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Sharathhebbar24/ssh_1.8B](https://huggingface.co/Sharathhebbar24/ssh_1.8B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sharathhebbar24__ssh_1.8B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-03T16:03:37.862164](https://huggingface.co/datasets/open-llm-leaderboard/details_Sharathhebbar24__ssh_1.8B/blob/main/results_2024-02-03T16-03-37.862164.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.4400975999737303,
"acc_stderr": 0.0345967614345703,
"acc_norm": 0.4431186866947614,
"acc_norm_stderr": 0.03532660922667111,
"mc1": 0.2668298653610771,
"mc1_stderr": 0.015483691939237265,
"mc2": 0.4314996062576424,
"mc2_stderr": 0.015306262833109105
},
"harness|arc:challenge|25": {
"acc": 0.3728668941979522,
"acc_stderr": 0.014131176760131165,
"acc_norm": 0.39078498293515357,
"acc_norm_stderr": 0.014258563880513778
},
"harness|hellaswag|10": {
"acc": 0.47560246962756425,
"acc_stderr": 0.004983837641502896,
"acc_norm": 0.6236805417247561,
"acc_norm_stderr": 0.00483471581420811
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4605263157894737,
"acc_stderr": 0.04056242252249033,
"acc_norm": 0.4605263157894737,
"acc_norm_stderr": 0.04056242252249033
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5056603773584906,
"acc_stderr": 0.030770900763851316,
"acc_norm": 0.5056603773584906,
"acc_norm_stderr": 0.030770900763851316
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.041227287076512825,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.041227287076512825
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3872832369942196,
"acc_stderr": 0.037143259063020656,
"acc_norm": 0.3872832369942196,
"acc_norm_stderr": 0.037143259063020656
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.040233822736177476,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.040233822736177476
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.032555253593403555,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.032555253593403555
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.023809523809523867,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.023809523809523867
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5,
"acc_stderr": 0.028444006199428714,
"acc_norm": 0.5,
"acc_norm_stderr": 0.028444006199428714
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33004926108374383,
"acc_stderr": 0.033085304262282574,
"acc_norm": 0.33004926108374383,
"acc_norm_stderr": 0.033085304262282574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5878787878787879,
"acc_stderr": 0.03843566993588717,
"acc_norm": 0.5878787878787879,
"acc_norm_stderr": 0.03843566993588717
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5,
"acc_stderr": 0.035623524993954825,
"acc_norm": 0.5,
"acc_norm_stderr": 0.035623524993954825
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6113989637305699,
"acc_stderr": 0.035177397963731316,
"acc_norm": 0.6113989637305699,
"acc_norm_stderr": 0.035177397963731316
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.40512820512820513,
"acc_stderr": 0.024890471769938145,
"acc_norm": 0.40512820512820513,
"acc_norm_stderr": 0.024890471769938145
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066482,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066482
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.46638655462184875,
"acc_stderr": 0.03240501447690071,
"acc_norm": 0.46638655462184875,
"acc_norm_stderr": 0.03240501447690071
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389024,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5467889908256881,
"acc_stderr": 0.021343255165546037,
"acc_norm": 0.5467889908256881,
"acc_norm_stderr": 0.021343255165546037
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35648148148148145,
"acc_stderr": 0.032664783315272714,
"acc_norm": 0.35648148148148145,
"acc_norm_stderr": 0.032664783315272714
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.49019607843137253,
"acc_stderr": 0.03508637358630572,
"acc_norm": 0.49019607843137253,
"acc_norm_stderr": 0.03508637358630572
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5316455696202531,
"acc_stderr": 0.032481974005110756,
"acc_norm": 0.5316455696202531,
"acc_norm_stderr": 0.032481974005110756
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4977578475336323,
"acc_stderr": 0.033557465352232634,
"acc_norm": 0.4977578475336323,
"acc_norm_stderr": 0.033557465352232634
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4732824427480916,
"acc_stderr": 0.04379024936553893,
"acc_norm": 0.4732824427480916,
"acc_norm_stderr": 0.04379024936553893
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.04792898170907062,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.04792898170907062
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4723926380368098,
"acc_stderr": 0.039223782906109894,
"acc_norm": 0.4723926380368098,
"acc_norm_stderr": 0.039223782906109894
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.044328040552915185,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.044328040552915185
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.046897659372781335,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.046897659372781335
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6581196581196581,
"acc_stderr": 0.03107502852650775,
"acc_norm": 0.6581196581196581,
"acc_norm_stderr": 0.03107502852650775
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5644955300127714,
"acc_stderr": 0.017730589927926588,
"acc_norm": 0.5644955300127714,
"acc_norm_stderr": 0.017730589927926588
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5115606936416185,
"acc_stderr": 0.026911898686377927,
"acc_norm": 0.5115606936416185,
"acc_norm_stderr": 0.026911898686377927
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767857,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767857
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5359477124183006,
"acc_stderr": 0.028555827516528787,
"acc_norm": 0.5359477124183006,
"acc_norm_stderr": 0.028555827516528787
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.45016077170418006,
"acc_stderr": 0.028256660723360184,
"acc_norm": 0.45016077170418006,
"acc_norm_stderr": 0.028256660723360184
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.027744313443376536,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.027744313443376536
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.028267657482650144,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.028267657482650144
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34485006518904826,
"acc_stderr": 0.012139881006287058,
"acc_norm": 0.34485006518904826,
"acc_norm_stderr": 0.012139881006287058
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3786764705882353,
"acc_stderr": 0.029465133639776132,
"acc_norm": 0.3786764705882353,
"acc_norm_stderr": 0.029465133639776132
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4084967320261438,
"acc_stderr": 0.01988622103750188,
"acc_norm": 0.4084967320261438,
"acc_norm_stderr": 0.01988622103750188
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5183673469387755,
"acc_stderr": 0.03198761546763127,
"acc_norm": 0.5183673469387755,
"acc_norm_stderr": 0.03198761546763127
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3781094527363184,
"acc_stderr": 0.03428867848778658,
"acc_norm": 0.3781094527363184,
"acc_norm_stderr": 0.03428867848778658
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3795180722891566,
"acc_stderr": 0.037777988227480165,
"acc_norm": 0.3795180722891566,
"acc_norm_stderr": 0.037777988227480165
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5146198830409356,
"acc_stderr": 0.038331852752130254,
"acc_norm": 0.5146198830409356,
"acc_norm_stderr": 0.038331852752130254
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2668298653610771,
"mc1_stderr": 0.015483691939237265,
"mc2": 0.4314996062576424,
"mc2_stderr": 0.015306262833109105
},
"harness|winogrande|5": {
"acc": 0.5927387529597474,
"acc_stderr": 0.013808654122417848
},
"harness|gsm8k|5": {
"acc": 0.27520849128127367,
"acc_stderr": 0.012302114305862647
}
}
```
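As a rough illustration of how these per-task scores combine, here is a minimal sketch that computes a macro-average accuracy over a few per-task entries. The `results` dict below copies three values from the JSON above by hand rather than loading them from the dataset, so it runs standalone:

```python
# Hypothetical sketch: macro-average the "acc" field over a handful of
# per-task entries copied from the results JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.25},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.37777777777777777},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.4605263157894737},
}

# Unweighted mean across tasks (each task counts equally, regardless of size).
accs = [entry["acc"] for entry in results.values()]
macro_avg = sum(accs) / len(accs)
print(round(macro_avg, 4))
```

The leaderboard's aggregated "all" block is produced by a similar averaging over the full set of tasks.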
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
FUBUKIBG/rosepronto | ---
license: openrail
---
|
heliosprime/twitter_dataset_1713204665 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 12837
num_examples: 36
download_size: 14403
dataset_size: 12837
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713204665"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-_fil_self_160m_bo2_100_kl_0.1_prm_160m_thr_0.3_seed_3 | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: index
dtype: int64
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43586042
num_examples: 18929
- name: epoch_1
num_bytes: 44122478
num_examples: 18929
- name: epoch_2
num_bytes: 44219397
num_examples: 18929
- name: epoch_3
num_bytes: 44261340
num_examples: 18929
- name: epoch_4
num_bytes: 44282722
num_examples: 18929
- name: epoch_5
num_bytes: 44300108
num_examples: 18929
- name: epoch_6
num_bytes: 44310569
num_examples: 18929
- name: epoch_7
num_bytes: 44316296
num_examples: 18929
- name: epoch_8
num_bytes: 44323129
num_examples: 18929
- name: epoch_9
num_bytes: 44326498
num_examples: 18929
- name: epoch_10
num_bytes: 44326077
num_examples: 18929
- name: epoch_11
num_bytes: 44327725
num_examples: 18929
- name: epoch_12
num_bytes: 44328350
num_examples: 18929
- name: epoch_13
num_bytes: 44330594
num_examples: 18929
- name: epoch_14
num_bytes: 44330360
num_examples: 18929
- name: epoch_15
num_bytes: 44332404
num_examples: 18929
- name: epoch_16
num_bytes: 44331677
num_examples: 18929
- name: epoch_17
num_bytes: 44332499
num_examples: 18929
- name: epoch_18
num_bytes: 44332572
num_examples: 18929
- name: epoch_19
num_bytes: 44334032
num_examples: 18929
- name: epoch_20
num_bytes: 44334167
num_examples: 18929
- name: epoch_21
num_bytes: 44333390
num_examples: 18929
- name: epoch_22
num_bytes: 44335365
num_examples: 18929
- name: epoch_23
num_bytes: 44334419
num_examples: 18929
- name: epoch_24
num_bytes: 44334230
num_examples: 18929
- name: epoch_25
num_bytes: 44333923
num_examples: 18929
- name: epoch_26
num_bytes: 44333784
num_examples: 18929
- name: epoch_27
num_bytes: 44334765
num_examples: 18929
- name: epoch_28
num_bytes: 44334889
num_examples: 18929
- name: epoch_29
num_bytes: 44334778
num_examples: 18929
download_size: 699687675
dataset_size: 1328698579
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
---
|
liuyanchen1015/MULTI_VALUE_wnli_indefinite_for_zero | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 12594
num_examples: 65
- name: test
num_bytes: 40930
num_examples: 144
- name: train
num_bytes: 115381
num_examples: 604
download_size: 62465
dataset_size: 168905
---
# Dataset Card for "MULTI_VALUE_wnli_indefinite_for_zero"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MatsuoDochiai/LUISA | ---
license: openrail
---
|
FidelOdok/SOFA_DOA | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
'3': '3'
'4': '4'
'5': '5'
'6': '6'
splits:
- name: train
num_bytes: 21491814313.0
num_examples: 22500
download_size: 21492710615
dataset_size: 21491814313.0
---
# Dataset Card for "SOFA_DOA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/tachibana_arisu_theidolmastercinderellagirlsu149 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Tachibana Arisu
This is the dataset of Tachibana Arisu, containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 486 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 486 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 486 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 486 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
liuyanchen1015/MULTI_VALUE_rte_inverted_indirect_question | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 28900
num_examples: 53
- name: train
num_bytes: 19696
num_examples: 41
download_size: 43113
dataset_size: 48596
---
# Dataset Card for "MULTI_VALUE_rte_inverted_indirect_question"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pierre-pessarossi/climate-question-answers | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 2467048
num_examples: 7033
- name: test
num_bytes: 622679
num_examples: 1758
download_size: 1892524
dataset_size: 3089727
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
license: mit
task_categories:
- question-answering
language:
- en
tags:
- climate
pretty_name: Climate change questions/answers
size_categories:
- 1K<n<10K
---
# Dataset Card for the Climate Change Questions/Answers Dataset
## Dataset Description
This is a first version of a question/answer dataset on climate change and ecology.
The dataset was created from a curated list of Wikipedia articles on climate change, available at https://huggingface.co/datasets/pierre-pessarossi/wikipedia-climate-data.
For each Wikipedia article in the original dataset, a set of question/answer pairs was created; the number of questions depends on the length of the article.
Currently, there are only question / answer pairs (i.e. no chat mode with several messages).
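As a minimal sketch of how a pair might be used for supervised fine-tuning (the prompt template and the sample record below are illustrative assumptions, not part of the dataset), each `instruction`/`answer` pair can be rendered into a single training string:

```python
def format_sft_example(example: dict) -> str:
    """Render one instruction/answer pair into a single training string.

    The "### Instruction / ### Answer" template is an illustrative
    choice, not something prescribed by this dataset.
    """
    return (
        "### Instruction:\n"
        f"{example['instruction'].strip()}\n\n"
        "### Answer:\n"
        f"{example['answer'].strip()}"
    )

# Hypothetical record following the card's two fields.
record = {
    "instruction": "What is the main driver of recent global warming?",
    "answer": "The accumulation of greenhouse gases from human activity.",
}
print(format_sft_example(record))
```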
Open-mixtral-8x7b was used to generate the questions and answers.
The dataset can be useful for supervised fine-tuning on the topic of climate change.
In forthcoming releases, the dataset will be expanded and different LLMs may be used to generate the question/answer pairs. |
bclavie/mmarco-japanese-hard-negatives | ---
language:
- ja
task_categories:
- text-retrieval
dataset_info:
features:
- name: query
dtype: string
- name: positives
sequence: string
- name: negatives
sequence: string
- name: bm25_negatives
sequence: string
- name: original_negatives
sequence: string
splits:
- name: train
num_bytes: 24494938913
num_examples: 391061
download_size: 11664534369
dataset_size: 24494938913
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
[Under Construction]
This repository contains all the queries from the Japanese part of the MMARCO dataset, the multilingual version of the MS MARCO dataset.
For each query, there are matching hard negatives:
- 25 retrieved by the multilingual-e5-base model.
- Up to 10 retrieved by the default Japanese BM25 implementation in the Anserini library. |
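As a hedged sketch (field names taken from the schema above; the helper and sample row are hypothetical), one row can be expanded into `(query, positive, hard negative)` training triplets by pooling the model-mined and BM25-mined negatives:

```python
def make_triplets(record: dict, max_per_source: int = 4) -> list[tuple[str, str, str]]:
    """Expand one dataset row into (query, positive, negative) triplets.

    Pools up to `max_per_source` negatives from each mining source; the
    cap and the all-pairs pairing are illustrative choices, not part of
    the dataset itself.
    """
    pooled = (
        record["negatives"][:max_per_source]
        + record["bm25_negatives"][:max_per_source]
    )
    return [
        (record["query"], positive, negative)
        for positive in record["positives"]
        for negative in pooled
    ]

# Hypothetical row following the schema above.
row = {
    "query": "東京の人口は?",
    "positives": ["東京の人口は約1400万人です。"],
    "negatives": ["大阪の人口は約880万人です。"],
    "bm25_negatives": ["東京は日本の首都です。"],
}
triplets = make_triplets(row)  # one positive x two negatives -> 2 triplets
```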
cacheop/red-right-hand | ---
license: other
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 711683
num_examples: 315
download_size: 338831
dataset_size: 711683
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
vishal323/heart | ---
license: openrail
---
|
CyberHarem/saori_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of saori/錠前サオリ/纱织 (Blue Archive)
This is the dataset of saori/錠前サオリ/纱织 (Blue Archive), containing 500 images and their tags.
The core tags of this character are `long_hair, breasts, halo, blue_eyes, blue_hair, black_hair, large_breasts, multicolored_hair, baseball_cap, hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1.11 GiB | [Download](https://huggingface.co/datasets/CyberHarem/saori_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 904.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/saori_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1354 | 1.80 GiB | [Download](https://huggingface.co/datasets/CyberHarem/saori_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/saori_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, bare_shoulders, black_headwear, black_pants, black_shirt, crop_top, holding_gun, midriff, navel, sig_sauer, sleeveless_shirt, solo, stomach, white_coat, assault_rifle, black_belt, off_shoulder, open_coat, black_gloves, looking_at_viewer, long_sleeves, mouth_mask, standing, leggings, buckle, armband, cowboy_shot |
| 1 | 7 |  |  |  |  |  | 1girl, bare_shoulders, black_headwear, black_pants, black_shirt, crop_top, looking_at_viewer, midriff, navel, off_shoulder, simple_background, sleeveless_shirt, solo, stomach, white_coat, cowboy_shot, standing, white_background, leggings, mouth_mask, open_coat, long_sleeves, black_belt |
| 2 | 11 |  |  |  |  |  | 1girl, black_pants, black_shirt, crop_top, long_sleeves, looking_at_viewer, midriff, mouth_mask, navel, sleeveless_shirt, solo, stomach, white_coat, bare_shoulders, black_belt, off_shoulder, open_coat, black_gloves, black_headwear, black_mask, cowboy_shot, standing, jacket, chest_harness, snap-fit_buckle, groin, holding_mask, medium_breasts, underbust, unworn_mask |
| 3 | 7 |  |  |  |  |  | 1girl, alternate_costume, black_bikini, navel, solo, stomach, cleavage, closed_mouth, collarbone, looking_at_viewer, thighs, bare_shoulders, blush, wet, black_headwear, cowboy_shot, outdoors, sideboob, string_bikini, long_sleeves, open_clothes, skindentation, standing, underboob, wading, water |
| 4 | 63 |  |  |  |  |  | white_dress, 1girl, alternate_costume, elbow_gloves, solo, white_gloves, cleavage, bare_shoulders, white_choker, looking_at_viewer, earrings, collarbone, colored_inner_hair, simple_background, white_background, closed_mouth, blush, blue_halo, strapless_dress, covered_navel, hair_flower |
| 5 | 27 |  |  |  |  |  | 1girl, elbow_gloves, looking_at_viewer, solo, white_dress, white_gloves, alternate_costume, ass, bare_shoulders, white_background, ponytail, simple_background, from_behind, looking_back, weapon, white_thighhighs, thigh_holster, blush, closed_mouth, knife, garter_straps, thigh_strap, colored_inner_hair |
| 6 | 5 |  |  |  |  |  | 1girl, alternate_costume, bare_shoulders, elbow_gloves, holding_gun, looking_at_viewer, ponytail, solo, white_dress, white_gloves, from_behind, looking_back, thigh_holster, simple_background, thigh_strap, white_background, assault_rifle, closed_mouth, handgun, mask, side_slit |
| 7 | 8 |  |  |  |  |  | 1boy, 1girl, hetero, penis, pussy, solo_focus, vaginal, anus, looking_at_viewer, looking_back, sex_from_behind, blush, pov, sweat, black_shirt, girl_on_top, indoors, reverse_cowgirl_position, sleeveless_shirt, bare_shoulders, mosaic_censoring, shirt_lift, all_fours, ass_focus, backboob, bottomless, completely_nude, dark-skinned_male, doggystyle, uncensored |
| 8 | 12 |  |  |  |  |  | 1boy, 1girl, blush, hetero, nipples, pussy, sex, solo_focus, vaginal, penis, looking_at_viewer, navel, censored, colored_inner_hair, pov, sweat, completely_nude, female_pubic_hair, choker, hair_ornament, missionary, on_back, spread_legs, elbow_gloves, open_mouth, white_gloves, white_thighhighs |
| 9 | 6 |  |  |  |  |  | 1girl, blush, gangbang, hetero, mosaic_censoring, penis, solo_focus, 3boys, erection, fellatio, nipples, sweat, completely_nude, testicles, cum, double_handjob, male_pubic_hair, navel, pussy, vaginal, veins |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | black_headwear | black_pants | black_shirt | crop_top | holding_gun | midriff | navel | sig_sauer | sleeveless_shirt | solo | stomach | white_coat | assault_rifle | black_belt | off_shoulder | open_coat | black_gloves | looking_at_viewer | long_sleeves | mouth_mask | standing | leggings | buckle | armband | cowboy_shot | simple_background | white_background | black_mask | jacket | chest_harness | snap-fit_buckle | groin | holding_mask | medium_breasts | underbust | unworn_mask | alternate_costume | black_bikini | cleavage | closed_mouth | collarbone | thighs | blush | wet | outdoors | sideboob | string_bikini | open_clothes | skindentation | underboob | wading | water | white_dress | elbow_gloves | white_gloves | white_choker | earrings | colored_inner_hair | blue_halo | strapless_dress | covered_navel | hair_flower | ass | ponytail | from_behind | looking_back | weapon | white_thighhighs | thigh_holster | knife | garter_straps | thigh_strap | handgun | mask | side_slit | 1boy | hetero | penis | pussy | solo_focus | vaginal | anus | sex_from_behind | pov | sweat | girl_on_top | indoors | reverse_cowgirl_position | mosaic_censoring | shirt_lift | all_fours | ass_focus | backboob | bottomless | completely_nude | dark-skinned_male | doggystyle | uncensored | nipples | sex | censored | female_pubic_hair | choker | hair_ornament | missionary | on_back | spread_legs | open_mouth | gangbang | 3boys | erection | fellatio | testicles | cum | double_handjob | male_pubic_hair | veins |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-----------------|:--------------|:--------------|:-----------|:--------------|:----------|:--------|:------------|:-------------------|:-------|:----------|:-------------|:----------------|:-------------|:---------------|:------------|:---------------|:--------------------|:---------------|:-------------|:-----------|:-----------|:---------|:----------|:--------------|:--------------------|:-------------------|:-------------|:---------|:----------------|:------------------|:--------|:---------------|:-----------------|:------------|:--------------|:--------------------|:---------------|:-----------|:---------------|:-------------|:---------|:--------|:------|:-----------|:-----------|:----------------|:---------------|:----------------|:------------|:---------|:--------|:--------------|:---------------|:---------------|:---------------|:-----------|:---------------------|:------------|:------------------|:----------------|:--------------|:------|:-----------|:--------------|:---------------|:---------|:-------------------|:----------------|:--------|:----------------|:--------------|:----------|:-------|:------------|:-------|:---------|:--------|:--------|:-------------|:----------|:-------|:------------------|:------|:--------|:--------------|:----------|:---------------------------|:-------------------|:-------------|:------------|:------------|:-----------|:-------------|:------------------|:--------------------|:-------------|:-------------|:----------|:------|:-----------|:--------------------|:---------|:----------------|:-------------|:----------|:--------------|:-------------|:-----------|:--------|:-----------|:-----------|:------------|:------|:-----------------|:------------------|:--------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | X | | X | X | | X | X | X | X | | X | X | X | | X | X | X | X | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 11 |  |  |  |  |  | X | X | X | X | X | X | | X | X | | X | X | X | X | | X | X | X | X | X | X | X | X | | | | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | X | | | | | | X | | | X | X | | | | | | | X | X | | X | | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 63 |  |  |  |  |  | X | X | | | | | | | | | | X | | | | | | | | X | | | | | | | | X | X | | | | | | | | | | X | | X | X | X | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 27 |  |  |  |  |  | X | X | | | | | | | | | | X | | | | | | | | X | | | | | | | | X | X | | | | | | | | | | X | | | X | | | X | | | | | | | | | | X | X | X | | | X | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | | | | | X | | | | | X | | | X | | | | | X | | | | | | | | X | X | | | | | | | | | | X | | | X | | | | | | | | | | | | | X | X | X | | | | | | | | | X | X | X | | | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 8 |  |  |  |  |  | X | X | | | X | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 8 | 12 |  |  |  |  |  | X | | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | X | | | X | | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | | | X | X | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 9 | 6 |  |  |  |  |  | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | X | | | | X | | | | | | X | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X |
|
nicholasbien/custom-txt | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3295001.7540148846
num_examples: 2042
- name: test
num_bytes: 824557.2459851155
num_examples: 511
download_size: 1904830
dataset_size: 4119559.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Dinite/vozisaac | ---
license: openrail
---
|
society-ethics/lila_camera_traps | ---
annotations_creators:
- expert-generated
license:
- other
language_creators:
- expert-generated
language:
- en
multilinguality:
- monolingual
size_categories:
- 10M<n<100M
source_datasets:
- original
task_categories:
- image-classification
tags:
- biodiversity
- camera trap data
- wildlife monitoring
pretty_name: LILA Camera Traps
---
# Dataset Card for LILA
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Tutorial](#tutorial)
- [Working with Taxonomies](#working-with-taxonomies)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://lila.science/
- **Repository:** N/A
- **Paper:** N/A
- **Leaderboard:** N/A
- **Point of Contact:** [info@lila.science](mailto:info@lila.science)
### Dataset Summary
LILA Camera Traps is an aggregate data set of images taken by camera traps, which are devices that automatically (e.g. via motion detection) capture images of wild animals to help ecological research.
This data set marks the first time that disparate camera trap data sets have been aggregated into a single training environment with a single [taxonomy](https://lila.science/taxonomy-mapping-for-camera-trap-data-sets/).
This data set consists only of camera trap image data sets, whereas the broader [LILA](https://lila.science/) website also hosts other data sets related to biology and conservation, intended as a resource for both machine learning (ML) researchers and those who want to harness ML for this topic.
See below for information about each specific dataset that LILA contains:
<details>
<summary> Caltech Camera Traps </summary>
This data set contains 243,100 images from 140 camera locations in the Southwestern United States, with labels for 21 animal categories (plus empty), primarily at the species level (for example, the most common labels are opossum, raccoon, and coyote), and approximately 66,000 bounding box annotations. Approximately 70% of images are labeled as empty.
More information about this data set is available [here](https://beerys.github.io/CaltechCameraTraps/).
This data set is released under the [Community Data License Agreement (permissive variant)](https://cdla.io/permissive-1-0/).
For questions about this data set, contact caltechcameratraps@gmail.com.
If you use this data set, please cite the associated manuscript:
```bibtex
@inproceedings{DBLP:conf/eccv/BeeryHP18,
author = {Sara Beery and
Grant Van Horn and
Pietro Perona},
title = {Recognition in Terra Incognita},
booktitle = {Computer Vision - {ECCV} 2018 - 15th European Conference, Munich,
Germany, September 8-14, 2018, Proceedings, Part {XVI}},
pages = {472--489},
year = {2018},
crossref = {DBLP:conf/eccv/2018-16},
url = {https://doi.org/10.1007/978-3-030-01270-0\_28},
doi = {10.1007/978-3-030-01270-0\_28},
timestamp = {Mon, 08 Oct 2018 17:08:07 +0200},
biburl = {https://dblp.org/rec/bib/conf/eccv/BeeryHP18},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
</details>
<details>
<summary> ENA24 </summary>
This data set contains approximately 10,000 camera trap images representing 23 classes from Eastern North America, with bounding boxes on each image. The most common classes are “American Crow”, “American Black Bear”, and “Dog”.
This data set is released under the [Community Data License Agreement (permissive variant)](https://cdla.io/permissive-1-0/).
Please cite this manuscript if you use this data set:
```bibtex
@article{yousif2019dynamic,
title={Dynamic Programming Selection of Object Proposals for Sequence-Level Animal Species Classification in the Wild},
author={Yousif, Hayder and Kays, Roland and He, Zhihai},
journal={IEEE Transactions on Circuits and Systems for Video Technology},
year={2019},
publisher={IEEE}
}
```
For questions about this data set, contact [Hayder Yousif](mailto:hyypp5@mail.missouri.edu).
</details>
<details>
<summary> Missouri Camera Traps </summary>
This data set contains approximately 25,000 camera trap images representing 20 species (for example, the most common labels are red deer, mouflon, and white-tailed deer). Images within each sequence share the same species label (even though the animal may not have been recorded in all the images in the sequence). Around 900 bounding boxes are included. These are very challenging sequences with highly cluttered and dynamic scenes. Spatial resolutions of the images vary from 1920 × 1080 to 2048 × 1536. Sequence lengths vary from 3 to more than 300 frames.
This data set is released under the [Community Data License Agreement (permissive variant)](https://cdla.io/permissive-1-0/).
If you use this data set, please cite the associated manuscript:
```bibtex
@article{zhang2016animal,
title={Animal detection from highly cluttered natural scenes using spatiotemporal object region proposals and patch verification},
author={Zhang, Zhi and He, Zhihai and Cao, Guitao and Cao, Wenming},
journal={IEEE Transactions on Multimedia},
volume={18},
number={10},
pages={2079--2092},
year={2016},
publisher={IEEE}
}
```
For questions about this data set, contact [Hayder Yousif](mailto:hyypp5@mail.missouri.edu) and [Zhi Zhang](mailto:zzbhf@mail.missouri.edu).
</details>
<details>
<summary> North American Camera Trap Images (NACTI) </summary>
This data set contains 3.7M camera trap images from five locations across the United States, with labels for 28 animal categories, primarily at the species level (for example, the most common labels are cattle, boar, and red deer). Approximately 12% of images are labeled as empty. We have also added bounding box annotations to 8892 images (mostly vehicles and birds).
This data set is released under the [Community Data License Agreement (permissive variant)](https://cdla.io/permissive-1-0/).
Please cite this manuscript if you use this data set:
```bibtex
@article{tabak2019machine,
title={Machine learning to classify animal species in camera trap images: Applications in ecology},
author={Tabak, Michael A and Norouzzadeh, Mohammad S and Wolfson, David W and Sweeney, Steven J and VerCauteren, Kurt C and Snow, Nathan P and Halseth, Joseph M and Di Salvo, Paul A and Lewis, Jesse S and White, Michael D and others},
journal={Methods in Ecology and Evolution},
volume={10},
number={4},
pages={585--590},
year={2019},
publisher={Wiley Online Library}
}
```
For questions about this data set, contact [northamericancameratrapimages@gmail.com](mailto:northamericancameratrapimages@gmail.com).
</details>
<details>
<summary> WCS Camera Traps </summary>
This data set contains approximately 1.4M camera trap images representing around 675 species from 12 countries, making it one of the most diverse camera trap data sets available publicly. Data were provided by the [Wildlife Conservation Society](https://www.wcs.org/). The most common classes are tayassu pecari (peccary), meleagris ocellata (ocellated turkey), and bos taurus (cattle). A complete list of classes and associated image counts is available here. Approximately 50% of images are empty. We have also added approximately 375,000 bounding box annotations to approximately 300,000 of those images, which come from sequences covering almost all locations.
Sequences are inferred from timestamps, so may not strictly represent bursts. Images were labeled at a combination of image and sequence level, so – as is the case with most camera trap data sets – empty images may be labeled as non-empty (if an animal was present in one frame of a sequence but not in others). Images containing humans are referred to in metadata, but are not included in the data files. You can find more information about the data set [on the LILA website](https://lila.science/datasets/wcscameratraps).
This data set is released under the [Community Data License Agreement (permissive variant)](https://cdla.io/permissive-1-0/).
</details>
<details>
<summary> Wellington Camera Traps </summary>
This data set contains 270,450 images from 187 camera locations in Wellington, New Zealand. The cameras (Bushnell 119537, 119476, and 119436) recorded sequences of three images when triggered. Each sequence was labelled by citizen scientists and/or professional ecologists from Victoria University of Wellington into 17 classes: 15 animal categories (for example, the most common labels are bird, cat, and hedgehog), empty, and unclassifiable. Approximately 17% of images are labeled as empty. Images within each sequence share the same species label (even though the animal may not have been recorded in all three images).
If you use this data set, please cite the associated manuscript:
```bibtex
@article{anton2018monitoring,
title={Monitoring the mammalian fauna of urban areas using remote cameras and citizen science},
author={Anton, Victor and Hartley, Stephen and Geldenhuis, Andre and Wittmer, Heiko U},
journal={Journal of Urban Ecology},
volume={4},
number={1},
pages={juy002},
year={2018},
publisher={Oxford University Press}
}
```
This data set is released under the [Community Data License Agreement (permissive variant)](https://cdla.io/permissive-1-0/).
For questions about this data set, contact [Victor Anton](mailto:vykanton@gmail.com).
</details>
<details>
<summary> Island Conservation Camera Traps </summary>
This data set contains approximately 123,000 camera trap images from 123 camera locations from 7 islands in 6 countries. Data were provided by Island Conservation during projects conducted to prevent the extinction of threatened species on islands.
The most common classes are rabbit, rat, petrel, iguana, cat, goat, and pig, with both rat and cat represented between multiple island sites representing significantly different ecosystems (tropical forest, dry forest, and temperate forests). Additionally, this data set represents data from locations and ecosystems that, to our knowledge, are not well represented in publicly available datasets including >1,000 images each of iguanas, petrels, and shearwaters. A complete list of classes and associated image counts is available here. Approximately 60% of the images are empty. We have also included approximately 65,000 bounding box annotations for about 50,000 images.
In general cameras were dispersed across each project site to detect the presence of invasive vertebrate species that threaten native island species. Cameras were set to capture bursts of photos for each motion detection event (between three and eight photos) with a set delay between events (10 to 30 seconds) to minimize the number of photos. Images containing humans are referred to in metadata, but are not included in the data files.
For questions about this data set, contact [David Will](mailto:david.will@islandconservation.org) at Island Conservation.
This data set is released under the [Community Data License Agreement (permissive variant)](https://cdla.io/permissive-1-0/).
The original data set included a “human” class label; for privacy reasons, we have removed those images from this version of the data set. Those labels are still present in the metadata. If those images are important to your work, contact us; in some cases it will be possible to release those images under an alternative license.
</details>
<details>
<summary> Channel Islands Camera Traps </summary>
This data set contains 246,529 camera trap images from 73 camera locations in the Channel Islands, California. All animals are annotated with bounding boxes. Data were provided by The Nature Conservancy. Animals are classified as rodent1 (82914), fox (48150), bird (11099), skunk (1071), or other (159). 114,949 images (47%) are empty. All images of rats were taken on islands already known to have rat populations.
If you use these data in a publication or report, please use the following citation:
The Nature Conservancy (2021): Channel Islands Camera Traps 1.0. The Nature Conservancy. Dataset.
For questions about this data set, contact [Nathaniel Rindlaub](nathaniel.rindlaub@TNC.ORG) at The Nature Conservancy.
This data set is released under the [Community Data License Agreement (permissive variant)](https://cdla.io/permissive-1-0/).
The original data set included a “human” class label; for privacy reasons, we have removed those images from this version of the data set. Those labels are still present in the metadata.
</details>
<details>
<summary> Idaho Camera Traps </summary>
This data set contains approximately 1.5 million camera trap images from Idaho. Labels are provided for 62 categories, most of which are animal classes (“deer”, “elk”, and “cattle” are the most common animal classes), but labels also include some state indicators (e.g. “snow on lens”, “foggy lens”). Approximately 70.5% of images are labeled as empty. Annotations were assigned to image sequences, rather than individual images, so annotations are meaningful only at the sequence level.
The metadata contains references to images containing humans, but these have been removed from the dataset (along with images containing vehicles and domestic dogs).
Images were provided by the Idaho Department of Fish and Game. No representations or warranties are made regarding the data, including but not limited to warranties of non-infringement or fitness for a particular purpose. Some information shared under this agreement may not have undergone quality assurance procedures and should be considered provisional. Images may not be sold in any format, but may be used for scientific publications. Please acknowledge the Idaho Department of Fish and Game when using images for publication or scientific communication.
</details>
<details>
<summary> Snapshot Serengeti </summary>
This data set contains approximately 2.65M sequences of camera trap images, totaling 7.1M images, from seasons one through eleven of the [Snapshot Serengeti project](https://snapshotserengeti.org/) -- the flagship project of the Snapshot Safari network. Using the same camera trapping protocols at every site, Snapshot Safari members are collecting standardized data from many protected areas in Africa, which allows for cross-site comparisons to assess the efficacy of conservation and restoration programs. Serengeti National Park in Tanzania is best known for the massive annual migrations of wildebeest and zebra that drive the cycling of its dynamic ecosystem.
Labels are provided for 61 categories, primarily at the species level (for example, the most common labels are wildebeest, zebra, and Thomson’s gazelle). Approximately 76% of images are labeled as empty. A full list of species and associated image counts is available [here](https://lilablobssc.blob.core.windows.net/snapshotserengeti-v-2-0/SnapshotSerengeti_S1-11_v2.1.species_list.csv). We have also added approximately 150,000 bounding box annotations to approximately 78,000 of those images.
The images and species-level labels are described in more detail in the associated manuscript:
```bibtex
@misc{dryad_5pt92,
title = {Data from: Snapshot Serengeti, high-frequency annotated camera trap images of 40 mammalian species in an African savanna},
author = {Swanson, AB and Kosmala, M and Lintott, CJ and Simpson, RJ and Smith, A and Packer, C},
year = {2015},
journal = {Scientific Data},
URL = {https://doi.org/10.5061/dryad.5pt92},
doi = {doi:10.5061/dryad.5pt92},
publisher = {Dryad Digital Repository}
}
```
For questions about this data set, contact [Sarah Huebner](huebn090@umn.edu) at the University of Minnesota.
This data set is released under the [Community Data License Agreement (permissive variant)](https://cdla.io/permissive-1-0/).
</details>
<details>
<summary> Snapshot Karoo </summary>
This data set contains 14889 sequences of camera trap images, totaling 38074 images, from the [Snapshot Karoo](https://www.zooniverse.org/projects/shuebner729/snapshot-karoo) project, part of the Snapshot Safari network. Using the same camera trapping protocols at every site, Snapshot Safari members are collecting standardized data from many protected areas in Africa, which allows for cross-site comparisons to assess the efficacy of conservation and restoration programs. Karoo National Park, located in the arid Nama Karoo biome of South Africa, is defined by its endemic vegetation and mountain landscapes. Its unique topographical gradient has led to a surprising amount of biodiversity, with 58 mammals and more than 200 bird species recorded, as well as a multitude of reptilian species.
Labels are provided for 38 categories, primarily at the species level (for example, the most common labels are gemsbokoryx, hartebeestred, and kudu). Approximately 83.02% of images are labeled as empty. A full list of species and associated image counts is available [here](https://lilablobssc.blob.core.windows.net/snapshot-safari/KAR/SnapshotKaroo_S1_v1.0.species_list.csv).
For questions about this data set, contact [Sarah Huebner](huebn090@umn.edu) at the University of Minnesota.
This data set is released under the [Community Data License Agreement (permissive variant)](https://cdla.io/permissive-1-0/).
</details>
<details>
<summary> Snapshot Kgalagadi </summary>
This data set contains 3611 sequences of camera trap images, totaling 10222 images, from the [Snapshot Kgalagadi](https://www.zooniverse.org/projects/shuebner729/snapshot-kgalagadi/) project, part of the Snapshot Safari network. Using the same camera trapping protocols at every site, Snapshot Safari members are collecting standardized data from many protected areas in Africa, which allows for cross-site comparisons to assess the efficacy of conservation and restoration programs. The Kgalagadi Transfrontier Park stretches from the Namibian border across South Africa and into Botswana, covering a landscape commonly referred to as the Kalahari – an arid savanna. This region is of great interest to help us understand how animals cope with extreme temperatures at both ends of the scale.
Labels are provided for 31 categories, primarily at the species level (for example, the most common labels are gemsbokoryx, birdother, and ostrich). Approximately 76.14% of images are labeled as empty. A full list of species and associated image counts is available [here](https://lilablobssc.blob.core.windows.net/snapshot-safari/KGA/SnapshotKgalagadi_S1_v1.0.species_list.csv).
For questions about this data set, contact [Sarah Huebner](huebn090@umn.edu) at the University of Minnesota.
This data set is released under the [Community Data License Agreement (permissive variant)](https://cdla.io/permissive-1-0/).
</details>
<details>
<summary> Snapshot Enonkishu </summary>
This data set contains 13301 sequences of camera trap images, totaling 28544 images, from the [Snapshot Enonkishu](https://www.zooniverse.org/projects/aguthmann/snapshot-enonkishu) project, part of the Snapshot Safari network. Using the same camera trapping protocols at every site, Snapshot Safari members are collecting standardized data from many protected areas in Africa, which allows for cross-site comparisons to assess the efficacy of conservation and restoration programs. Enonkishu Conservancy is located on the northern boundary of the Mara-Serengeti ecosystem in Kenya, and is managed by a consortium of stakeholders and land-owning Maasai families. Their aim is to promote coexistence between wildlife and livestock in order to encourage regenerative grazing and build stability in the Mara conservancies.
Labels are provided for 39 categories, primarily at the species level (for example, the most common labels are impala, warthog, and zebra). Approximately 64.76% of images are labeled as empty. A full list of species and associated image counts is available [here](https://lilablobssc.blob.core.windows.net/snapshot-safari/ENO/SnapshotEnonkishu_S1_v1.0.species_list.csv).
For questions about this data set, contact [Sarah Huebner](huebn090@umn.edu) at the University of Minnesota.
This data set is released under the [Community Data License Agreement (permissive variant)](https://cdla.io/permissive-1-0/).
</details>
<details>
<summary> Snapshot Camdeboo </summary>
This data set contains 12132 sequences of camera trap images, totaling 30227 images, from the [Snapshot Camdeboo](https://www.zooniverse.org/projects/shuebner729/snapshot-camdeboo) project, part of the Snapshot Safari network. Using the same camera trapping protocols at every site, Snapshot Safari members are collecting standardized data from many protected areas in Africa, which allows for cross-site comparisons to assess the efficacy of conservation and restoration programs. Camdeboo National Park, South Africa is crucial habitat for many birds on a global scale, with greater than fifty endemic and near-endemic species and many migratory species.
Labels are provided for 43 categories, primarily at the species level (for example, the most common labels are kudu, springbok, and ostrich). Approximately 43.74% of images are labeled as empty. A full list of species and associated image counts is available [here](https://lilablobssc.blob.core.windows.net/snapshot-safari/CDB/SnapshotCamdeboo_S1_v1.0.species_list.csv).
For questions about this data set, contact [Sarah Huebner](huebn090@umn.edu) at the University of Minnesota.
This data set is released under the [Community Data License Agreement (permissive variant)](https://cdla.io/permissive-1-0/).
</details>
<details>
<summary> Snapshot Mountain Zebra </summary>
This data set contains 71688 sequences of camera trap images, totaling 73034 images, from the [Snapshot Mountain Zebra](https://www.zooniverse.org/projects/meredithspalmer/snapshot-mountain-zebra/) project, part of the Snapshot Safari network. Using the same camera trapping protocols at every site, Snapshot Safari members are collecting standardized data from many protected areas in Africa, which allows for cross-site comparisons to assess the efficacy of conservation and restoration programs. Mountain Zebra National Park is located in the Eastern Cape of South Africa in a transitional area between several distinct biomes, which means it is home to many endemic species. As the name suggests, this park contains the largest remnant population of Cape Mountain zebras, ~700 as of 2019 and increasing steadily every year.
Labels are provided for 54 categories, primarily at the species level (for example, the most common labels are zebramountain, kudu, and springbok). Approximately 91.23% of images are labeled as empty. A full list of species and associated image counts is available [here](https://lilablobssc.blob.core.windows.net/snapshot-safari/MTZ/SnapshotMountainZebra_S1_v1.0.species_list.csv).
For questions about this data set, contact [Sarah Huebner](huebn090@umn.edu) at the University of Minnesota.
This data set is released under the [Community Data License Agreement (permissive variant)](https://cdla.io/permissive-1-0/).
</details>
<details>
<summary> Snapshot Kruger </summary>
This data set contains 4747 sequences of camera trap images, totaling 10072 images, from the [Snapshot Kruger](https://www.zooniverse.org/projects/shuebner729/snapshot-kruger) project, part of the Snapshot Safari network. Using the same camera trapping protocols at every site, Snapshot Safari members are collecting standardized data from many protected areas in Africa, which allows for cross-site comparisons to assess the efficacy of conservation and restoration programs. Kruger National Park, South Africa has been a refuge for wildlife since its establishment in 1898, and it houses one of the most diverse wildlife assemblages remaining in Africa. The Snapshot Safari grid was established in 2018 as part of a research project assessing the impacts of large mammals on plant life as boundary fences were removed and wildlife reoccupied areas of previous extirpation.
Labels are provided for 46 categories, primarily at the species level (for example, the most common labels are impala, elephant, and buffalo). Approximately 61.60% of images are labeled as empty. A full list of species and associated image counts is available [here](https://lilablobssc.blob.core.windows.net/snapshot-safari/KRU/SnapshotKruger_S1_v1.0.species_list.csv).
For questions about this data set, contact [Sarah Huebner](huebn090@umn.edu) at the University of Minnesota.
This data set is released under the [Community Data License Agreement (permissive variant)](https://cdla.io/permissive-1-0/).
</details>
<details>
<summary> SWG Camera Traps </summary>
This data set contains 436,617 sequences of camera trap images from 982 locations in Vietnam and Lao, totaling 2,039,657 images. Labels are provided for 120 categories, primarily at the species level (for example, the most common labels are “Eurasian Wild Pig”, “Large-antlered Muntjac”, and “Unidentified Murid”). Approximately 12.98% of images are labeled as empty. A full list of species and associated image counts is available here. 101,659 bounding boxes are provided on 88,135 images.
This data set is provided by the Saola Working Group; providers include:
- IUCN SSC Asian Wild Cattle Specialist Group’s Saola Working Group (SWG)
- Asian Arks
- Wildlife Conservation Society (Lao)
- WWF Lao
- Integrated Conservation of Biodiversity and Forests project, Lao (ICBF)
- Center for Environment and Rural Development, Vinh University, Vietnam
If you use these data in a publication or report, please use the following citation:
SWG (2021): Northern and Central Annamites Camera Traps 2.0. IUCN SSC Asian Wild Cattle Specialist Group’s Saola Working Group. Dataset.
For questions about this data set, contact saolawg@gmail.com.
This data set is released under the [Community Data License Agreement (permissive variant)](https://cdla.io/permissive-1-0/).
</details>
<details>
<summary> Orinoquia Camera Traps </summary>
This data set contains 104,782 images collected from a 50-camera-trap array deployed from January to July 2020 within the private natural reserves El Rey Zamuro (31 km2) and Las Unamas (40 km2), located in the Meta department in the Orinoquía region in central Colombia. We deployed cameras using a stratified random sampling design across forest core area strata. Cameras were spaced 1 km apart from one another, located facing wildlife trails, and deployed with no bait. Images were stored and reviewed by experts using the Wildlife Insights platform.
This data set contains 51 classes, predominantly mammals such as the collared peccary, black agouti, spotted paca, white-lipped peccary, lowland tapir, and giant anteater. Approximately 20% of images are empty.
The main purpose of the study is to understand how humans, wildlife, and domestic animals interact in multi-functional landscapes (e.g., agricultural livestock areas with native forest remnants). However, this data set was also used to review model performance of AI-powered platforms – Wildlife Insights (WI), MegaDetector (MD), and Machine Learning for Wildlife Image Classification (MLWIC2). We provide a demonstration of the use of WI, MD, and MLWIC2 and R code for evaluating model performance of these platforms in the accompanying [GitHub repository](https://github.com/julianavelez1/Processing-Camera-Trap-Data-Using-AI).
If you use these data in a publication or report, please use the following citation:
```bibtex
@article{velez2022choosing,
title={Choosing an Appropriate Platform and Workflow for Processing Camera Trap Data using Artificial Intelligence},
author={V{\'e}lez, Juliana and Castiblanco-Camacho, Paula J and Tabak, Michael A and Chalmers, Carl and Fergus, Paul and Fieberg, John},
journal={arXiv preprint arXiv:2202.02283},
year={2022}
}
```
For questions about this data set, contact [Juliana Velez Gomez](julianavelezgomez@gmail.com).
This data set is released under the [Community Data License Agreement (permissive variant)](https://cdla.io/permissive-1-0/).
</details>
### Supported Tasks and Leaderboards
No leaderboards exist for LILA.
### Languages
The [LILA taxonomy](https://lila.science/taxonomy-mapping-for-camera-trap-data-sets/) is provided in English.
## Dataset Structure
### Data Instances
The data annotations are provided in [COCO Camera Traps](https://github.com/Microsoft/CameraTraps/blob/master/data_management/README.md#coco-cameratraps-format) format.
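As a minimal illustration, the sketch below parses a small, hypothetical annotation file in this format using only the standard library (the field names follow the COCO Camera Traps specification linked above; the image, annotation, and category values are made up):

```python
import json

# Hypothetical COCO Camera Traps-style annotations (illustrative values only).
sample = """
{
  "images": [{"id": "img_001", "file_name": "loc1/img_001.jpg",
              "width": 1920, "height": 1080, "location": "loc1"}],
  "annotations": [{"id": "ann_001", "image_id": "img_001", "category_id": 1}],
  "categories": [{"id": 1, "name": "cat"}]
}
"""
data = json.loads(sample)

# Map category ids to names, then resolve each annotation to a per-image label.
id_to_category = {c["id"]: c["name"] for c in data["categories"]}
labels = {a["image_id"]: id_to_category[a["category_id"]] for a in data["annotations"]}
print(labels)  # {'img_001': 'cat'}
```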
All of the datasets share a common category taxonomy, which is defined on the [LILA website](https://lila.science/taxonomy-mapping-for-camera-trap-data-sets/).
### Data Fields
Different datasets may have slightly varying fields, which include:
`file_name`: the file name \
`width` and `height`: the dimensions of the image \
`study`: which research study the image was collected as part of \
`location` : the name of the location at which the image was taken \
`annotations`: information about image annotation, which includes the taxonomy information, bounding box/boxes (`bbox`/`bboxes`) if any, as well as any other annotation information. \
`image` : the `path` to download the image and any other information that is available, e.g. its size in `bytes`.
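To make the field layout concrete, here is a hypothetical instance in the shape described above (all values are invented, the exact schema varies slightly between configurations, and taxonomy values in the real dataset are `ClassLabel` integers rather than strings):

```python
# A made-up instance illustrating the fields listed above (not real data).
instance = {
    "file_name": "loc_0001/2019-06-01_12-00-00.jpg",
    "width": 2048,
    "height": 1536,
    "study": "example_study",
    "location": "loc_0001",
    "annotations": {
        "taxonomy": [{"class": "mammalia", "species": "felis catus"}],
        "bbox": [[0.1, 0.2, 0.3, 0.4]],
    },
    "image": {"path": "https://example.org/loc_0001/img.jpg", "bytes": None},
}

# Annotations hold the taxonomy information and any bounding boxes.
first_species = instance["annotations"]["taxonomy"][0]["species"]
print(first_species)  # felis catus
```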
### Data Splits
This dataset does not have a predefined train/test split.
## Dataset Creation
### Curation Rationale
The datasets that constitute LILA have been provided by the organizations, projects and researchers who collected them.
### Source Data
#### Initial data collection and normalization
N/A
#### Who are the source language producers?
N/A
### Annotations
#### Annotation process
Each dataset has been annotated by the members of the project/organization that provided it.
#### Who are the annotators?
The annotations have been provided by domain experts in fields such as biology and ecology.
### Personal and Sensitive Information
Some of the original data sets included a “human” class label; for privacy reasons, these images were removed. Those labels are still present in the metadata. If those images are important to your work, contact the [LILA maintainers](mailto:info@lila.science), since in some cases it will be possible to release those images under an alternative license.
## Considerations for Using the Data
### Social Impact of Dataset
Machine learning depends on labeled data, but accessing such data in biology and conservation is a challenge. Consequently, everyone benefits when labeled data is made available. Biologists and conservation scientists benefit by having data to train on, and free hosting allows teams to multiply the impact of their data (we suggest listing this benefit in grant proposals that fund data collection). ML researchers benefit by having data to experiment with.
### Discussion of Biases
These datasets do not represent global diversity, but are examples of local ecosystems and animals.
### Other Known Limitations
N/A
## Additional Information
### Tutorial
The [tutorial in this Google Colab notebook](https://colab.research.google.com/drive/17gPOIK-ksxPyX6yP9TaKIimlwf9DYe2R?usp=sharing) demonstrates how to work with this dataset, including filtering by species, collating configurations, and downloading images.
### Working with Taxonomies
All taxonomy categories are saved as `ClassLabel` features, which can be converted between integers and strings as needed (for example, to filter the dataset). The example below filters the "Caltech Camera Traps" configuration to find all entries with "felis catus" as the species of the first annotation.
```python
from datasets import load_dataset

dataset = load_dataset("society-ethics/lila_camera_traps", "Caltech Camera Traps", split="train")
taxonomy = dataset.features["annotations"].feature["taxonomy"]
# Filter to show only cats
cats = dataset.filter(lambda x: x["annotations"]["taxonomy"][0]["species"] == taxonomy["species"].str2int("felis catus"))
```
The original common names have been saved with their taxonomy mappings in this repository in `common_names_to_tax.json`. These can be used, for example, to map from a taxonomy combination to a common name to help make queries more legible. Note, however, that there is a small number of duplicate common names with different taxonomy values which you will need to disambiguate.
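Because of those duplicates, it can be worth checking which common names appear more than once before relying on the mapping. A small sketch of that check, using made-up JSON-lines rows in the shape of `common_names_to_tax.json`:

```python
import io

import pandas as pd

# Illustrative JSON-lines content in the shape of common_names_to_tax.json
# (rows and taxonomy values are made up to demonstrate the duplicate-name check).
sample = "\n".join([
    '{"common_name": "sea turtle", "class": "reptilia", "species": null}',
    '{"common_name": "domestic cat", "class": "mammalia", "species": "felis catus"}',
    '{"common_name": "domestic cat", "class": "mammalia", "species": null}',
])
df = pd.read_json(io.StringIO(sample), lines=True)

# keep=False marks every row of a duplicated name, not just the repeats.
duplicated = sorted(df[df.duplicated("common_name", keep=False)]["common_name"].unique())
print(duplicated)  # ['domestic cat']
```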
The following example loads the first "sea turtle" in the "Island Conservation Camera Traps" dataset.
```python
import pandas as pd
from datasets import load_dataset

LILA_COMMON_NAMES_TO_TAXONOMY = pd.read_json("https://huggingface.co/datasets/society-ethics/lila_camera_traps/raw/main/data/common_names_to_tax.json", lines=True).set_index("common_name")
dataset = load_dataset("society-ethics/lila_camera_traps", "Island Conservation Camera Traps", split="train")
taxonomy = dataset.features["annotations"].feature["taxonomy"]
sea_turtle = LILA_COMMON_NAMES_TO_TAXONOMY.loc["sea turtle"].to_dict()
sea_turtle = {k: taxonomy[k].str2int(v) if v is not None else v for k, v in sea_turtle.items()}  # Map to ClassLabel integers
sea_turtle_dataset = dataset.filter(lambda x: x["annotations"]["taxonomy"][0] == sea_turtle)
```
The example below selects a random item from the dataset, and then maps from the taxonomy to a common name:
```python
import numpy as np
import pandas as pd
from datasets import load_dataset

LILA_COMMON_NAMES_TO_TAXONOMY = pd.read_json("https://huggingface.co/datasets/society-ethics/lila_camera_traps/raw/main/data/common_names_to_tax.json", lines=True).set_index("common_name")
dataset = load_dataset("society-ethics/lila_camera_traps", "Caltech Camera Traps", split="train")
taxonomy = dataset.features["annotations"].feature["taxonomy"]
random_entry = dataset.shuffle()[0]
filter_taxonomy = random_entry["annotations"]["taxonomy"][0]
# Keep only the taxonomy levels that are set, mapped back to their string labels
filter_keys = [(k, taxonomy[k].int2str(v)) for k, v in filter_taxonomy.items() if v is not None]
if len(filter_keys) > 0:
print(LILA_COMMON_NAMES_TO_TAXONOMY[np.logical_and.reduce([
LILA_COMMON_NAMES_TO_TAXONOMY[k] == v for k,v in filter_keys
])])
else:
print("No common name found for the item.")
```
### Dataset Curators
LILA BC is maintained by a working group that includes representatives from Ecologize, Zooniverse, the Evolving AI Lab, Snapshot Safari, and Microsoft AI for Earth. Hosting on Microsoft Azure is provided by Microsoft AI for Earth.
### Licensing Information
Many, but not all, LILA data sets were released under the [Community Data License Agreement (permissive variant)](https://cdla.io/permissive-1-0/). Check the details of the specific dataset you are using in its section above.
### Citation Information
Citations for each dataset (if they exist) are provided in its section above.
### Contributions
Thanks to [@NimaBoscarino](https://github.com/NimaBoscarino/) for adding this dataset.
|
open-llm-leaderboard/details_migtissera__Tess-10.7B-v1.5 | ---
pretty_name: Evaluation run of migtissera/Tess-10.7B-v1.5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [migtissera/Tess-10.7B-v1.5](https://huggingface.co/migtissera/Tess-10.7B-v1.5)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Tess-10.7B-v1.5\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-27T08:16:07.104140](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-10.7B-v1.5/blob/main/results_2024-01-27T08-16-07.104140.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6513655424521041,\n\
\ \"acc_stderr\": 0.03165794538741113,\n \"acc_norm\": 0.6541051437423594,\n\
\ \"acc_norm_stderr\": 0.032296434557489546,\n \"mc1\": 0.32558139534883723,\n\
\ \"mc1_stderr\": 0.01640398946990783,\n \"mc2\": 0.47430080710659894,\n\
\ \"mc2_stderr\": 0.014677705750823734\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.60580204778157,\n \"acc_stderr\": 0.014280522667467325,\n\
\ \"acc_norm\": 0.6501706484641638,\n \"acc_norm_stderr\": 0.013936809212158289\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6490738896634136,\n\
\ \"acc_stderr\": 0.004762844770909862,\n \"acc_norm\": 0.8406691894045011,\n\
\ \"acc_norm_stderr\": 0.0036523632532895916\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7631578947368421,\n \"acc_stderr\": 0.03459777606810536,\n\
\ \"acc_norm\": 0.7631578947368421,\n \"acc_norm_stderr\": 0.03459777606810536\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.69,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438665,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438665\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.035506839891655796,\n\
\ \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.035506839891655796\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n\
\ \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n\
\ \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5957446808510638,\n\
\ \"acc_stderr\": 0.032081157507886836,\n \"acc_norm\": 0.5957446808510638,\n\
\ \"acc_norm_stderr\": 0.032081157507886836\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n\
\ \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"\
acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.02542483508692399,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.02542483508692399\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.02275520495954294,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.02275520495954294\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n\
\ \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822513,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822513\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.01871899852067819,\n\
\ \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.01871899852067819\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.024321738484602357,\n\
\ \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.024321738484602357\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.029719142876342853,\n\
\ \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.029719142876342853\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5787037037037037,\n \"acc_stderr\": 0.03367462138896078,\n \"\
acc_norm\": 0.5787037037037037,\n \"acc_norm_stderr\": 0.03367462138896078\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8481012658227848,\n \"acc_stderr\": 0.02336387809663245,\n \
\ \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.02336387809663245\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n\
\ \"acc_stderr\": 0.03036037971029195,\n \"acc_norm\": 0.7130044843049327,\n\
\ \"acc_norm_stderr\": 0.03036037971029195\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596915,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596915\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.03957835471980981,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.03957835471980981\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044283,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044283\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29497206703910617,\n\
\ \"acc_stderr\": 0.015251931579208181,\n \"acc_norm\": 0.29497206703910617,\n\
\ \"acc_norm_stderr\": 0.015251931579208181\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7712418300653595,\n \"acc_stderr\": 0.024051029739912258,\n\
\ \"acc_norm\": 0.7712418300653595,\n \"acc_norm_stderr\": 0.024051029739912258\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142246,\n \"\
acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142246\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.48565840938722293,\n\
\ \"acc_stderr\": 0.01276498182952427,\n \"acc_norm\": 0.48565840938722293,\n\
\ \"acc_norm_stderr\": 0.01276498182952427\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7279411764705882,\n \"acc_stderr\": 0.02703304115168146,\n\
\ \"acc_norm\": 0.7279411764705882,\n \"acc_norm_stderr\": 0.02703304115168146\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7591836734693878,\n \"acc_stderr\": 0.02737294220178816,\n\
\ \"acc_norm\": 0.7591836734693878,\n \"acc_norm_stderr\": 0.02737294220178816\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.03015113445777634,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.03015113445777634\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533193,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533193\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32558139534883723,\n\
\ \"mc1_stderr\": 0.01640398946990783,\n \"mc2\": 0.47430080710659894,\n\
\ \"mc2_stderr\": 0.014677705750823734\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8334648776637726,\n \"acc_stderr\": 0.010470796496781091\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5435936315390447,\n \
\ \"acc_stderr\": 0.013720038270485327\n }\n}\n```"
repo_url: https://huggingface.co/migtissera/Tess-10.7B-v1.5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|arc:challenge|25_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|gsm8k|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hellaswag|10_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T08-16-07.104140.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-27T08-16-07.104140.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- '**/details_harness|winogrande|5_2024-01-27T08-16-07.104140.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-27T08-16-07.104140.parquet'
- config_name: results
data_files:
- split: 2024_01_27T08_16_07.104140
path:
- results_2024-01-27T08-16-07.104140.parquet
- split: latest
path:
- results_2024-01-27T08-16-07.104140.parquet
---
# Dataset Card for Evaluation run of migtissera/Tess-10.7B-v1.5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [migtissera/Tess-10.7B-v1.5](https://huggingface.co/migtissera/Tess-10.7B-v1.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
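As an illustration of the split-naming convention described above, a timestamped split name such as `2024_01_27T08_16_07.104140` can be parsed back into a `datetime` (a small sketch; the format string is inferred from the split names in this card):

```python
from datetime import datetime

# Format string inferred from split names like "2024_01_27T08_16_07.104140".
SPLIT_FORMAT = "%Y_%m_%dT%H_%M_%S.%f"

def parse_split_timestamp(split_name: str) -> datetime:
    """Convert a timestamped split name back into a datetime object."""
    return datetime.strptime(split_name, SPLIT_FORMAT)

print(parse_split_timestamp("2024_01_27T08_16_07.104140"))
```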
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__Tess-10.7B-v1.5",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-27T08:16:07.104140](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-10.7B-v1.5/blob/main/results_2024-01-27T08-16-07.104140.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6513655424521041,
"acc_stderr": 0.03165794538741113,
"acc_norm": 0.6541051437423594,
"acc_norm_stderr": 0.032296434557489546,
"mc1": 0.32558139534883723,
"mc1_stderr": 0.01640398946990783,
"mc2": 0.47430080710659894,
"mc2_stderr": 0.014677705750823734
},
"harness|arc:challenge|25": {
"acc": 0.60580204778157,
"acc_stderr": 0.014280522667467325,
"acc_norm": 0.6501706484641638,
"acc_norm_stderr": 0.013936809212158289
},
"harness|hellaswag|10": {
"acc": 0.6490738896634136,
"acc_stderr": 0.004762844770909862,
"acc_norm": 0.8406691894045011,
"acc_norm_stderr": 0.0036523632532895916
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7631578947368421,
"acc_stderr": 0.03459777606810536,
"acc_norm": 0.7631578947368421,
"acc_norm_stderr": 0.03459777606810536
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438665,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438665
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.035506839891655796,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.035506839891655796
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.032081157507886836,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.032081157507886836
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.02542483508692399,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.02542483508692399
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8,
"acc_stderr": 0.02275520495954294,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02275520495954294
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822513,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822513
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.01871899852067819,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.01871899852067819
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.024321738484602357,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.024321738484602357
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.029719142876342853,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.029719142876342853
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5787037037037037,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.5787037037037037,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.02336387809663245,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.02336387809663245
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.03036037971029195,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.03036037971029195
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596915,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596915
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.03957835471980981,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.03957835471980981
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044283,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29497206703910617,
"acc_stderr": 0.015251931579208181,
"acc_norm": 0.29497206703910617,
"acc_norm_stderr": 0.015251931579208181
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7712418300653595,
"acc_stderr": 0.024051029739912258,
"acc_norm": 0.7712418300653595,
"acc_norm_stderr": 0.024051029739912258
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142246,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142246
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.48565840938722293,
"acc_stderr": 0.01276498182952427,
"acc_norm": 0.48565840938722293,
"acc_norm_stderr": 0.01276498182952427
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7279411764705882,
"acc_stderr": 0.02703304115168146,
"acc_norm": 0.7279411764705882,
"acc_norm_stderr": 0.02703304115168146
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507208,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507208
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7591836734693878,
"acc_stderr": 0.02737294220178816,
"acc_norm": 0.7591836734693878,
"acc_norm_stderr": 0.02737294220178816
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306053,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306053
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.03015113445777634,
"acc_norm": 0.9,
"acc_norm_stderr": 0.03015113445777634
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533193,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533193
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32558139534883723,
"mc1_stderr": 0.01640398946990783,
"mc2": 0.47430080710659894,
"mc2_stderr": 0.014677705750823734
},
"harness|winogrande|5": {
"acc": 0.8334648776637726,
"acc_stderr": 0.010470796496781091
},
"harness|gsm8k|5": {
"acc": 0.5435936315390447,
"acc_stderr": 0.013720038270485327
}
}
```
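As a sketch of how an unweighted average metric could be recomputed from a results dict like the one above (only a subset of tasks is shown here; key names follow the JSON, and the simple mean used below is an assumption, not necessarily the leaderboard's exact aggregation):

```python
# Subset of per-task metrics, copied from the results JSON above.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.6501706484641638},
    "harness|hellaswag|10": {"acc_norm": 0.8406691894045011},
    "harness|winogrande|5": {"acc": 0.8334648776637726},
    "harness|gsm8k|5": {"acc": 0.5435936315390447},
}

def mean_metric(res: dict) -> float:
    # Prefer acc_norm when present, fall back to acc, then take the mean.
    vals = [m.get("acc_norm", m.get("acc")) for m in res.values()]
    return sum(vals) / len(vals)

print(mean_metric(results))
```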
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
jirong/grit_2m | ---
license: apache-2.0
---
|
CyberHarem/jacques_de_molay_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of jacques_de_molay/ジャック・ド・モレー/雅克·德·莫莱 (Fate/Grand Order)
This is the dataset of jacques_de_molay/ジャック・ド・モレー/雅克·德·莫莱 (Fate/Grand Order), containing 122 images and their tags.
The core tags of this character are `short_hair, grey_hair, glasses, breasts, large_breasts, blue_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 122 | 202.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jacques_de_molay_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 122 | 173.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jacques_de_molay_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 314 | 348.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/jacques_de_molay_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/jacques_de_molay_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, bare_shoulders, black_dress, horns, smile, solo, yellow_eyes, cleavage, grey_skin, looking_at_viewer, choker, collarbone, detached_sleeves, open_mouth, thighs |
| 1 | 7 |  |  |  |  |  | bare_shoulders, black_dress, cleavage, collarbone, grey_skin, horns, looking_at_viewer, yellow_eyes, 1girl, solo, choker, detached_sleeves, smile, short_dress, simple_background, sword, thighs |
| 2 | 8 |  |  |  |  |  | 1girl, belt, black_dress, black_jacket, cleavage, cropped_jacket, hooded_jacket, long_sleeves, looking_at_viewer, open_jacket, short_dress, smile, sheep, thighs, sword, purple_eyes |
| 3 | 5 |  |  |  |  |  | 1girl, belt, black_dress, black_jacket, cropped_jacket, hooded_jacket, long_sleeves, looking_at_viewer, open_jacket, open_mouth, short_dress, solo, sword, thighs, cleavage, smile, yellow_eyes |
| 4 | 6 |  |  |  |  |  | 1girl, belt, black_dress, black_jacket, cleavage, cropped_jacket, hooded_jacket, long_sleeves, looking_at_viewer, open_jacket, short_dress, solo, sword, smile, sheath |
| 5 | 9 |  |  |  |  |  | 1girl, black_dress, black_jacket, cropped_jacket, hooded_jacket, long_sleeves, looking_at_viewer, open_jacket, solo, cleavage, smile, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | black_dress | horns | smile | solo | yellow_eyes | cleavage | grey_skin | looking_at_viewer | choker | collarbone | detached_sleeves | open_mouth | thighs | short_dress | simple_background | sword | belt | black_jacket | cropped_jacket | hooded_jacket | long_sleeves | open_jacket | sheep | purple_eyes | sheath | upper_body |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------------|:--------|:--------|:-------|:--------------|:-----------|:------------|:--------------------|:---------|:-------------|:-------------------|:-------------|:---------|:--------------|:--------------------|:--------|:-------|:---------------|:-----------------|:----------------|:---------------|:--------------|:--------|:--------------|:---------|:-------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | X | X | X | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | | X | | X | | | X | | X | | | | | X | X | | X | X | X | X | X | X | X | X | X | | |
| 3 | 5 |  |  |  |  |  | X | | X | | X | X | X | X | | X | | | | X | X | X | | X | X | X | X | X | X | X | | | | |
| 4 | 6 |  |  |  |  |  | X | | X | | X | X | | X | | X | | | | | | X | | X | X | X | X | X | X | X | | | X | |
| 5 | 9 |  |  |  |  |  | X | | X | | X | X | | X | | X | | | | | | | | | | X | X | X | X | X | | | | X |
|
Multimodal-Fatima/VQAv2_test_split_8 | ---
dataset_info:
features:
- name: question_type
dtype: string
- name: multiple_choice_answer
dtype: string
- name: answers
sequence: string
- name: answers_original
list:
- name: answer
dtype: string
- name: answer_confidence
dtype: string
- name: answer_id
dtype: int64
- name: id_image
dtype: int64
- name: answer_type
dtype: string
- name: question_id
dtype: int64
- name: question
dtype: string
- name: image
dtype: image
- name: id
dtype: int64
- name: clip_tags_ViT_L_14
sequence: string
- name: blip_caption
dtype: string
- name: LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14
sequence: string
- name: DETA_detections_deta_swin_large_o365_coco_classes
list:
- name: attribute
dtype: string
- name: box
sequence: float32
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float32
- name: size
dtype: string
- name: tag
dtype: string
- name: Attributes_ViT_L_14_descriptors_text_davinci_003_full
sequence: string
- name: clip_tags_ViT_L_14_wo_openai
sequence: string
- name: clip_tags_ViT_L_14_with_openai
sequence: string
- name: clip_tags_LAION_ViT_H_14_2B_wo_openai
sequence: string
- name: clip_tags_LAION_ViT_H_14_2B_with_openai
sequence: string
- name: clip_tags_LAION_ViT_bigG_14_2B_wo_openai
sequence: string
- name: clip_tags_LAION_ViT_bigG_14_2B_with_openai
sequence: string
- name: Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full
sequence: string
- name: Attributes_LAION_ViT_bigG_14_2B_descriptors_text_davinci_003_full
sequence: string
- name: clip_tags_ViT_B_16_with_openai
sequence: string
- name: DETA_detections_deta_swin_large_o365_coco_classes_caption_module_random
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: captions_module
sequence: string
- name: captions_module_filter
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
splits:
- name: test
num_bytes: 9157441080.0
num_examples: 44779
download_size: 1848746963
dataset_size: 9157441080.0
---
# Dataset Card for "VQAv2_test_split_8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
crumb/tiny-slimpajama-k8-00001 | ---
dataset_info:
features:
- name: text
dtype: string
- name: meta
dtype: string
- name: __index_level_0__
dtype: int64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 25963772232
num_examples: 5899634
download_size: 15090997326
dataset_size: 25963772232
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
HSJuan/korquad-aug-valid | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: pos_aug
dtype: string
- name: neg_aug
dtype: string
splits:
- name: validation
num_bytes: 9298037
num_examples: 5774
download_size: 1728913
dataset_size: 9298037
---
# Dataset Card for "korquad-aug-valid"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tartuNLP/liv4ever | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
- liv
license:
- cc-by-nc-sa-4.0
multilinguality:
- translation
size_categories:
- unknown
source_datasets:
- original
task_categories:
- text2text-generation
- translation
task_ids: []
pretty_name: Liv4ever
language_bcp47:
- en-US
- liv
tags:
- conditional-text-generation
---
# liv4ever v1
This is the Livonian 4-lingual parallel corpus. Livonian is a Uralic / Finnic language with just about 20 fluent speakers and no native speakers (as of 2021). The texts and translations in this corpus were collected from all the digital text resources that could be found by the authors; scanned and printed materials are left for future work.
The corpus includes parallel data for Livonian-Latvian, Livonian-Estonian and Livonian-English; the data has been collected in 2021. After retrieval it was normalized in terms of different orthographies of Livonian and manually sentence-aligned where needed. It was collected from the following sources, with sentence counts per language pair:
* Dictionary - example sentences from the Livonian-Latvian-Estonian dictionary;
* liv-lv: 10'388,
* liv-et: 10'378
* Stalte - the alphabet book by Kōrli Stalte, translated into Estonian and Latvian;
* liv-lv: 842,
* liv-et: 685
* Poetry - the poetry collection book "Ma võtan su õnge, tursk / Ma akūb sīnda vizzõ, tūrska", with Estonian translations;
* liv-et: 770
* Vääri - the book by Eduard Vääri about Livonian language and culture;
* liv-et: 592
* Satversme - translations of the Latvian Constitution into Livonian, Estonian and English;
* liv-en: 380,
* liv-lv: 414,
* liv-et: 413
* Facebook - social media posts by the Livonian Institute and Livonian Days with original translations;
* liv-en: 123,
* liv-lv: 124,
* liv-et: 7
* JEFUL - article abstracts from the Journal of Estonian and Finno-Ugric Linguistics, special issues dedicated to Livonian studies, translated into Estonian and English;
* liv-en: 36,
* liv-et: 49
* Trilium - the book with a collection of Livonian poetry, foreword and afterword translated into Estonian and Latvian;
* liv-lv: 51,
* liv-et: 53
* Songs - material crawled off lyricstranslate.com;
* liv-en: 54,
* liv-lv: 54,
* liv-fr: 31 |
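The per-source sentence counts listed above can be summed into totals per language pair (a small sketch; the numbers are copied directly from the source list):

```python
# Per-source sentence counts copied from the liv4ever source list.
counts = {
    "dictionary": {"liv-lv": 10388, "liv-et": 10378},
    "stalte":     {"liv-lv": 842, "liv-et": 685},
    "poetry":     {"liv-et": 770},
    "vaari":      {"liv-et": 592},
    "satversme":  {"liv-en": 380, "liv-lv": 414, "liv-et": 413},
    "facebook":   {"liv-en": 123, "liv-lv": 124, "liv-et": 7},
    "jeful":      {"liv-en": 36, "liv-et": 49},
    "trilium":    {"liv-lv": 51, "liv-et": 53},
    "songs":      {"liv-en": 54, "liv-lv": 54, "liv-fr": 31},
}

totals: dict = {}
for source in counts.values():
    for pair, n in source.items():
        totals[pair] = totals.get(pair, 0) + n

print(totals)  # total sentence pairs per language pair
```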
TuringsSolutions/PFAF-Function | ---
license: mit
---
|
kaleemWaheed/twitter_dataset_1713138543 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 13116
num_examples: 32
download_size: 10247
dataset_size: 13116
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|